Apr 20 10:24:05 user nova-compute[71283]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
Apr 20 10:24:08 user nova-compute[71283]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=71283) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=71283) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=71283) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Apr 20 10:24:08 user nova-compute[71283]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.018s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 20 10:24:08 user nova-compute[71283]: INFO nova.virt.driver [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] Loading compute driver 'libvirt.LibvirtDriver'
Apr 20 10:24:08 user nova-compute[71283]: INFO nova.compute.provider_config [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] Acquiring lock "singleton_lock" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] Acquired lock "singleton_lock" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] Releasing lock "singleton_lock" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] Full set of CONF: {{(pid=71283) _wait_for_exit_or_signal /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ******************************************************************************** {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] Configuration options gathered from: {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] command line args: ['--config-file', '/etc/nova/nova-cpu.conf'] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] config files: ['/etc/nova/nova-cpu.conf'] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ================================================================================ {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] allow_resize_to_same_host = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] arq_binding_timeout = 300 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] backdoor_port = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] backdoor_socket = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] block_device_allocate_retries = 300 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] block_device_allocate_retries_interval = 5 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cert = self.pem {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] compute_driver = libvirt.LibvirtDriver {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] compute_monitors = [] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] config_dir = [] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] config_drive_format = iso9660 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] config_file = ['/etc/nova/nova-cpu.conf'] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] config_source = [] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] console_host = user {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] control_exchange = nova {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cpu_allocation_ratio = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] daemon = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] debug = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] default_access_ip_network_name = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] default_availability_zone = nova {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] default_ephemeral_format = ext4 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] default_schedule_zone = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] disk_allocation_ratio = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] enable_new_services = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] enabled_apis = ['osapi_compute'] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] enabled_ssl_apis = [] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] flat_injected = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] force_config_drive = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] force_raw_images = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:08 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] graceful_shutdown_timeout = 5 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] heal_instance_info_cache_interval = 60 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] host = user {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] initial_cpu_allocation_ratio = 4.0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] initial_disk_allocation_ratio = 1.0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] initial_ram_allocation_ratio = 1.0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] instance_build_timeout = 0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] instance_delete_interval = 300 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] instance_format = [instance: %(uuid)s] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] instance_name_template = instance-%08x {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] instance_usage_audit = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] instance_usage_audit_period = month {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] instances_path = /opt/stack/data/nova/instances {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] internal_service_availability_zone = internal {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] key = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] live_migration_retry_count = 30 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] log_config_append = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] log_dir = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] log_file = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] log_options = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] log_rotate_interval = 1 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] log_rotate_interval_type = days {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] log_rotation_type = none {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] long_rpc_timeout = 1800 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] max_concurrent_builds = 10 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] max_concurrent_live_migrations = 1 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] max_concurrent_snapshots = 5 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] max_local_block_devices = 3 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] max_logfile_count = 30 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] max_logfile_size_mb = 200 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] maximum_instance_delete_attempts = 5 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] metadata_listen = 0.0.0.0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] metadata_listen_port = 8775 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] metadata_workers = 3 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] migrate_max_retries = -1 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] mkisofs_cmd = genisoimage {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] my_block_storage_ip = 10.0.0.210 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] my_ip = 10.0.0.210 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] network_allocate_retries = 0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] osapi_compute_listen = 0.0.0.0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] osapi_compute_listen_port = 8774 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] osapi_compute_unique_server_name_scope = {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] osapi_compute_workers = 3 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] password_length = 12 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] periodic_enable = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] periodic_fuzzy_delay = 60 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] pointer_model = ps2mouse {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] preallocate_images = none {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] publish_errors = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] pybasedir = /opt/stack/nova {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ram_allocation_ratio = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] rate_limit_burst = 0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] rate_limit_except_level = CRITICAL {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] rate_limit_interval = 0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] reboot_timeout = 0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] reclaim_instance_interval = 0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] record = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] reimage_timeout_per_gb = 20 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] report_interval = 10 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] rescue_timeout = 0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] reserved_host_cpus = 0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] reserved_host_disk_mb = 0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] reserved_host_memory_mb = 512 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] reserved_huge_pages = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] resize_confirm_window = 0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] resize_fs_using_block_device = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] resume_guests_state_on_host_boot = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] rpc_response_timeout = 60 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] run_external_periodic_tasks = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] running_deleted_instance_action = reap {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] running_deleted_instance_poll_interval = 1800 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] running_deleted_instance_timeout = 0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] scheduler_instance_sync_interval = 120 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] service_down_time = 60 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] servicegroup_driver = db {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] shelved_offload_time = 0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] shelved_poll_interval = 3600 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] shutdown_timeout = 0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] source_is_ipv6 = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ssl_only = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] state_path = /opt/stack/data/nova {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] sync_power_state_interval = 600 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] sync_power_state_pool_size = 1000 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] syslog_log_facility = LOG_USER {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] tempdir = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] timeout_nbd = 10 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] transport_url = **** {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] update_resources_interval = 0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] use_cow_images = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] use_eventlog = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] use_journal = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] use_json = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] use_rootwrap_daemon = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] use_stderr = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] use_syslog = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vcpu_pin_set = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vif_plugging_is_fatal = False {{(pid=71283) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vif_plugging_timeout = 0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] virt_mkfs = [] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] volume_usage_poll_interval = 0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] watch_log_file = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] web = /usr/share/spice-html5 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_concurrency.disable_process_locking = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_concurrency.lock_path = /opt/stack/data/nova {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] 
oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_metrics.metrics_process_name = {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api.auth_strategy = keystone {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api.compute_link_prefix = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 
2008-09-01 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api.dhcp_domain = novalocal {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api.enable_instance_password = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api.glance_link_prefix = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api.instance_list_cells_batch_strategy = distributed {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api.instance_list_per_project_cells = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api.list_records_by_skipping_down_cells = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG 
oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api.local_metadata_per_cell = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api.max_limit = 1000 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api.metadata_cache_expiration = 15 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api.neutron_default_tenant_id = default {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api.use_forwarded_for = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api.use_neutron_default_nets = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api.vendordata_dynamic_failure_fatal = False {{(pid=71283) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api.vendordata_dynamic_ssl_certfile = {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api.vendordata_dynamic_targets = [] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api.vendordata_jsonfile_path = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api.vendordata_providers = ['StaticJSON'] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.backend = dogpile.cache.memcached {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.backend_argument = **** {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa 
None None] cache.config_prefix = cache.oslo {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.dead_timeout = 60.0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.debug_cache_backend = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.enable_retry_client = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.enable_socket_keepalive = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.enabled = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.expiration_time = 600 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.hashclient_retry_attempts = 2 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None 
req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.hashclient_retry_delay = 1.0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.memcache_dead_retry = 300 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.memcache_password = {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.memcache_pool_maxsize = 10 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.memcache_pool_unused_timeout = 60 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.memcache_sasl_enabled = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} 
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.memcache_servers = ['localhost:11211'] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.memcache_socket_timeout = 1.0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.memcache_username = {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.proxies = [] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.retry_attempts = 2 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.retry_delay = 0.0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.socket_keepalive_count = 1 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.socket_keepalive_idle = 1 {{(pid=71283) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.socket_keepalive_interval = 1 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.tls_allowed_ciphers = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.tls_cafile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.tls_certfile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.tls_enabled = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cache.tls_keyfile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cinder.auth_section = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cinder.auth_type = password {{(pid=71283) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cinder.cafile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cinder.catalog_info = volumev3::publicURL {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cinder.certfile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cinder.collect_timing = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cinder.cross_az_attach = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cinder.debug = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cinder.endpoint_template = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cinder.http_retries = 3 {{(pid=71283) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cinder.insecure = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cinder.keyfile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cinder.os_region_name = RegionOne {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cinder.split_loggers = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cinder.timeout = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] compute.cpu_dedicated_set = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] 
compute.cpu_shared_set = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] compute.image_type_exclude_list = [] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] compute.live_migration_wait_for_vif_plug = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] compute.max_concurrent_disk_ops = 0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] compute.max_disk_devices_to_attach = -1 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] compute.resource_provider_association_refresh = 300 {{(pid=71283) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] compute.shutdown_retry_interval = 10 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] conductor.workers = 3 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] console.allowed_origins = [] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] console.ssl_ciphers = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] console.ssl_minimum_version = default {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] consoleauth.token_ttl = 600 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cyborg.cafile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cyborg.certfile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cyborg.collect_timing = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cyborg.connect_retries = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cyborg.connect_retry_delay = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cyborg.endpoint_override = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cyborg.insecure = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cyborg.keyfile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cyborg.max_version = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cyborg.min_version = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cyborg.region_name = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cyborg.service_name = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cyborg.service_type = accelerator {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cyborg.split_loggers = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cyborg.status_code_retries = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cyborg.status_code_retry_delay = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cyborg.timeout = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] cyborg.version = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] database.backend = sqlalchemy {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] database.connection = **** {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] database.connection_debug = 0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] database.connection_parameters = {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] database.connection_recycle_time = 3600 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] database.connection_trace = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] database.db_inc_retry_interval = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] database.db_max_retries = 20 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] database.db_max_retry_interval = 10 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] database.db_retry_interval = 1 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] database.max_overflow = 50 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] database.max_pool_size = 5 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] database.max_retries = 10 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] database.mysql_enable_ndb = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] database.mysql_sql_mode = TRADITIONAL {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] database.mysql_wsrep_sync_wait = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] database.pool_timeout = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] database.retry_interval = 10 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] database.slave_connection = **** {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] database.sqlite_synchronous = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api_database.backend = sqlalchemy {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api_database.connection = **** {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api_database.connection_debug = 0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api_database.connection_parameters = {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api_database.connection_recycle_time = 3600 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api_database.connection_trace = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api_database.db_inc_retry_interval = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api_database.db_max_retries = 20 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api_database.db_max_retry_interval = 10 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api_database.db_retry_interval = 1 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api_database.max_overflow = 50 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api_database.max_pool_size = 5 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api_database.max_retries = 10 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api_database.mysql_enable_ndb = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api_database.mysql_wsrep_sync_wait = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api_database.pool_timeout = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api_database.retry_interval = 10 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api_database.slave_connection = **** {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] api_database.sqlite_synchronous = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] devices.enabled_mdev_types = [] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ephemeral_storage_encryption.enabled = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ephemeral_storage_encryption.key_size = 512 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.api_servers = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.cafile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.certfile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.collect_timing = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.connect_retries = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.connect_retry_delay = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.debug = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.default_trusted_certificate_ids = [] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.enable_certificate_validation = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.enable_rbd_download = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.endpoint_override = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.insecure = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.keyfile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.max_version = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.min_version = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.num_retries = 3 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.rbd_ceph_conf = {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.rbd_connect_timeout = 5 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.rbd_pool = {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.rbd_user = {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.region_name = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.service_name = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.service_type = image {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.split_loggers = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.status_code_retries = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.status_code_retry_delay = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.timeout = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.verify_glance_signatures = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] glance.version = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] guestfs.debug = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] hyperv.config_drive_cdrom = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] hyperv.config_drive_inject_password = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] hyperv.enable_instance_metrics_collection = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] hyperv.enable_remotefx = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] hyperv.instances_path_share = {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] hyperv.iscsi_initiator_list = [] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] hyperv.limit_cpu_features = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] hyperv.power_state_check_timeframe = 60 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] hyperv.power_state_event_polling_interval = 2 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] hyperv.use_multipath_io = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] hyperv.volume_attach_retry_count = 10 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] hyperv.volume_attach_retry_interval = 5 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] hyperv.vswitch_name = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] mks.enabled = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] image_cache.manager_interval = 2400 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] image_cache.precache_concurrency = 1 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] image_cache.remove_unused_base_images = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] image_cache.subdirectory_name = _base {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ironic.api_max_retries = 60 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ironic.api_retry_interval = 2 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ironic.auth_section = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ironic.auth_type = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ironic.cafile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ironic.certfile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ironic.collect_timing = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ironic.connect_retries = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ironic.connect_retry_delay = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ironic.endpoint_override = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ironic.insecure = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ironic.keyfile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ironic.max_version = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ironic.min_version = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ironic.partition_key = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ironic.peer_list = [] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ironic.region_name = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ironic.serial_console_state_timeout = 10 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ironic.service_name = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ironic.service_type = baremetal {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ironic.split_loggers = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ironic.status_code_retries = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ironic.status_code_retry_delay = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ironic.timeout = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ironic.version = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] key_manager.fixed_key = **** {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=71283) log_opt_values
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] barbican.barbican_api_version = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] barbican.barbican_endpoint = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] barbican.barbican_endpoint_type = public {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] barbican.barbican_region_name = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] barbican.cafile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] barbican.certfile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] barbican.collect_timing = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] barbican.insecure = 
False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] barbican.keyfile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] barbican.number_of_retries = 60 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] barbican.retry_delay = 1 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] barbican.send_service_user_token = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] barbican.split_loggers = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] barbican.timeout = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] barbican.verify_ssl = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] 
barbican.verify_ssl_path = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] barbican_service_user.auth_section = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] barbican_service_user.auth_type = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] barbican_service_user.cafile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] barbican_service_user.certfile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] barbican_service_user.collect_timing = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] barbican_service_user.insecure = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] barbican_service_user.keyfile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: 
DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] barbican_service_user.split_loggers = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] barbican_service_user.timeout = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vault.approle_role_id = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vault.approle_secret_id = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vault.cafile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vault.certfile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vault.collect_timing = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vault.insecure = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user 
nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vault.keyfile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vault.kv_mountpoint = secret {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vault.kv_version = 2 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vault.namespace = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vault.root_token_id = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vault.split_loggers = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vault.ssl_ca_crt_file = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vault.timeout = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: 
DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vault.use_ssl = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] keystone.cafile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] keystone.certfile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] keystone.collect_timing = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] keystone.connect_retries = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] keystone.connect_retry_delay = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] keystone.endpoint_override = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user 
nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] keystone.insecure = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] keystone.keyfile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] keystone.max_version = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] keystone.min_version = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] keystone.region_name = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] keystone.service_name = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] keystone.service_type = identity {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] keystone.split_loggers = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 
user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] keystone.status_code_retries = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] keystone.status_code_retry_delay = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] keystone.timeout = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] keystone.version = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.connection_uri = {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.cpu_mode = custom {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.cpu_model_extra_flags = [] {{(pid=71283) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: WARNING oslo_config.cfg [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] Deprecated: Option "cpu_model" from group "libvirt" is deprecated. Use option "cpu_models" from group "libvirt". Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.cpu_models = ['Nehalem'] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.cpu_power_governor_high = performance {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.cpu_power_governor_low = powersave {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.cpu_power_management = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.device_detach_attempts = 8 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None 
None] libvirt.device_detach_timeout = 20 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.disk_cachemodes = [] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.disk_prefix = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.enabled_perf_events = [] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.file_backed_memory = 0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.gid_maps = [] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.hw_disk_discard = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.hw_machine_type = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None 
req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.images_rbd_ceph_conf = {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.images_rbd_glance_store_name = {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.images_rbd_pool = rbd {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.images_type = default {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.images_volume_group = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.inject_key = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 
10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.inject_partition = -2 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.inject_password = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.iscsi_iface = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.iser_use_multipath = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.live_migration_bandwidth = 0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.live_migration_completion_timeout = 800 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.live_migration_downtime = 500 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.live_migration_downtime_delay = 75 {{(pid=71283) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.live_migration_downtime_steps = 10 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.live_migration_inbound_addr = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.live_migration_permit_auto_converge = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.live_migration_permit_post_copy = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.live_migration_scheme = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.live_migration_timeout_action = abort {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.live_migration_tunnelled = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: WARNING oslo_config.cfg 
[None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Apr 20 10:24:09 user nova-compute[71283]: live_migration_uri is deprecated for removal in favor of two other options that Apr 20 10:24:09 user nova-compute[71283]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Apr 20 10:24:09 user nova-compute[71283]: and ``live_migration_inbound_addr`` respectively. Apr 20 10:24:09 user nova-compute[71283]: ). Its value may be silently ignored in the future. Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.live_migration_uri = qemu+ssh://stack@%s/system {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.live_migration_with_native_tls = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.max_queues = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.mem_stats_period_seconds = 10 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.nfs_mount_options = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None 
req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.nfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.num_aoe_discover_tries = 3 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.num_iser_scan_tries = 5 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.num_memory_encrypted_guests = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.num_nvme_discover_tries = 5 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.num_pcie_ports = 0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.num_volume_scan_tries = 5 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.pmem_namespaces = [] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.quobyte_client_cfg = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.quobyte_mount_point_base = /opt/stack/data/nova/mnt {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.rbd_connect_timeout = 5 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.rbd_secret_uuid = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.rbd_user = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.realtime_scheduler_priority = 1 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.remote_filesystem_transport = ssh {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.rescue_image_id = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.rescue_kernel_id = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.rescue_ramdisk_id = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.rng_dev_path = /dev/urandom {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.rx_queue_size = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.smbfs_mount_options = {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.smbfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.snapshot_compression = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.snapshot_image_format = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.sparse_logical_volumes = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.swtpm_enabled = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.swtpm_group = tss {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.swtpm_user = tss {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.sysinfo_serial = unique {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.tx_queue_size = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.uid_maps = [] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.use_virtio_for_bridges = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.virt_type = kvm {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.volume_clear = zero {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.volume_clear_size = 0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.volume_use_multipath = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.vzstorage_cache_path = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.vzstorage_mount_group = qemu {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.vzstorage_mount_opts = [] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/nova/mnt {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.vzstorage_mount_user = stack {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] neutron.auth_section = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] neutron.auth_type = password {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] neutron.cafile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] neutron.certfile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] neutron.collect_timing = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] neutron.connect_retries = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] neutron.connect_retry_delay = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] neutron.default_floating_pool = public {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] neutron.endpoint_override = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] neutron.extension_sync_interval = 600 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] neutron.http_retries = 3 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] neutron.insecure = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] neutron.keyfile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] neutron.max_version = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] neutron.metadata_proxy_shared_secret = **** {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] neutron.min_version = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] neutron.ovs_bridge = br-int {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] neutron.physnets = [] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] neutron.region_name = RegionOne {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] neutron.service_metadata_proxy = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] neutron.service_name = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] neutron.service_type = network {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] neutron.split_loggers = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] neutron.status_code_retries = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] neutron.status_code_retry_delay = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] neutron.timeout = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] neutron.version = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] notifications.bdms_in_notifications = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] notifications.default_level = INFO {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] notifications.notification_format = unversioned {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] notifications.notify_on_state_change = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] pci.alias = [] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] pci.device_spec = [] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] pci.report_in_placement = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.auth_section = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.auth_type = password {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.auth_url = http://10.0.0.210/identity {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.cafile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.certfile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.collect_timing = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.connect_retries = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.connect_retry_delay = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.default_domain_id = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.default_domain_name = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.domain_id = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.domain_name = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.endpoint_override = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.insecure = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.keyfile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.max_version = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.min_version = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.password = **** {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.project_domain_id = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.project_domain_name = Default {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.project_id = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.project_name = service {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.region_name = RegionOne {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.service_name = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.service_type = placement {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.split_loggers = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.status_code_retries = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.status_code_retry_delay = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.system_scope = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.timeout = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.trust_id = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.user_domain_id = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.user_domain_name = Default {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.user_id = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.username = placement {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] placement.version = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] quota.cores = 20 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] quota.count_usage_from_placement = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] quota.injected_file_content_bytes = 10240 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] quota.injected_file_path_length = 255 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] quota.injected_files = 5 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] quota.instances = 10 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] quota.key_pairs = 100 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] quota.metadata_items = 128 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] quota.ram = 51200 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] quota.recheck_quota = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] quota.server_group_members = 10 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] quota.server_groups = 10 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] rdp.enabled = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] scheduler.image_metadata_prefilter = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] scheduler.max_attempts = 3 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] scheduler.max_placement_results = 1000 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] scheduler.query_placement_for_availability_zone = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] scheduler.query_placement_for_image_type_support = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] scheduler.workers = 3 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] filter_scheduler.enabled_filters = ['AvailabilityZoneFilter', 'ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] filter_scheduler.host_subset_size = 1 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] filter_scheduler.image_properties_default_architecture = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] filter_scheduler.isolated_hosts = [] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] filter_scheduler.isolated_images = [] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] filter_scheduler.max_instances_per_host = 50 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG
oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] filter_scheduler.pci_in_placement = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None 
None] filter_scheduler.track_instance_changes = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] metrics.required = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] metrics.weight_multiplier = 1.0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] metrics.weight_of_unavailable = -10000.0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] metrics.weight_setting = [] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] serial_console.enabled = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 
user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] serial_console.port_range = 10000:20000 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] serial_console.serialproxy_port = 6083 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] service_user.auth_section = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] service_user.auth_type = password {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] service_user.cafile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] service_user.certfile = None {{(pid=71283) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] service_user.collect_timing = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] service_user.insecure = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] service_user.keyfile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] service_user.send_service_user_token = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] service_user.split_loggers = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] service_user.timeout = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] spice.agent_enabled = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] spice.enabled = False 
{{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] spice.html5proxy_base_url = http://10.0.0.210:6081/spice_auto.html {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] spice.html5proxy_host = 0.0.0.0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] spice.html5proxy_port = 6082 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] spice.image_compression = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] spice.jpeg_compression = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] spice.playback_compression = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] spice.server_listen = 127.0.0.1 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None 
req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] spice.streaming_mode = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] spice.zlib_compression = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] upgrade_levels.baseapi = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] upgrade_levels.cert = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] upgrade_levels.compute = auto {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] upgrade_levels.conductor = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] upgrade_levels.scheduler = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: 
DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vendordata_dynamic_auth.auth_section = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vendordata_dynamic_auth.auth_type = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vendordata_dynamic_auth.cafile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vendordata_dynamic_auth.certfile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vendordata_dynamic_auth.collect_timing = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vendordata_dynamic_auth.insecure = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vendordata_dynamic_auth.keyfile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vendordata_dynamic_auth.split_loggers = False {{(pid=71283) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vendordata_dynamic_auth.timeout = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vmware.api_retry_count = 10 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vmware.ca_file = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vmware.cache_prefix = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vmware.cluster_name = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vmware.connection_pool_size = 10 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vmware.console_delay_seconds = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vmware.datastore_regex = 
None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vmware.host_ip = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vmware.host_password = **** {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vmware.host_port = 443 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vmware.host_username = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vmware.insecure = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vmware.integration_bridge = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vmware.maximum_objects = 100 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vmware.pbm_default_policy = 
None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vmware.pbm_enabled = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vmware.pbm_wsdl_location = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vmware.serial_port_proxy_uri = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vmware.serial_port_service_uri = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vmware.task_poll_interval = 0.5 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vmware.use_linked_clone = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None 
req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vmware.vnc_keymap = en-us {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vmware.vnc_port = 5900 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vmware.vnc_port_total = 10000 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vnc.auth_schemes = ['none'] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vnc.enabled = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vnc.novncproxy_base_url = http://10.0.0.210:6080/vnc_lite.html {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vnc.novncproxy_port = 6080 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG 
oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vnc.server_listen = 0.0.0.0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vnc.server_proxyclient_address = 10.0.0.210 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vnc.vencrypt_ca_certs = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vnc.vencrypt_client_cert = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vnc.vencrypt_client_key = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] workarounds.disable_fallback_pcpu_query = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] workarounds.disable_group_policy_check_upcall = True {{(pid=71283) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] workarounds.disable_rootwrap = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] workarounds.enable_numa_live_migration = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] workarounds.handle_virt_lifecycle_events = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] workarounds.libvirt_disable_apic = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 10:24:09 user nova-compute[71283]: 
DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] workarounds.never_download_image_if_on_rbd = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] wsgi.client_socket_timeout = 900 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] wsgi.default_pool_size = 1000 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] wsgi.keep_alive = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] wsgi.max_header_line = 16384 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] wsgi.secure_proxy_ssl_header = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] wsgi.ssl_ca_file = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] wsgi.ssl_cert_file = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] wsgi.ssl_key_file = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] wsgi.tcp_keepidle = 600 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] zvm.ca_file = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] zvm.cloud_connector_url = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] zvm.image_tmp_path = /opt/stack/data/nova/images {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] zvm.reachable_timeout = 300 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_policy.enforce_new_defaults = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_policy.enforce_scope = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_policy.policy_default_rule = default {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_policy.policy_file = policy.yaml {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] profiler.connection_string = messaging:// {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] profiler.enabled = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] profiler.es_doc_type = notification {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] profiler.es_scroll_size = 10000 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] profiler.es_scroll_time = 2m {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] profiler.filter_error_trace = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] profiler.hmac_keys = SECRET_KEY {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] profiler.sentinel_service_name = mymaster {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] profiler.socket_timeout = 0.1 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] profiler.trace_sqlalchemy = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] remote_debug.host = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] remote_debug.port = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.rabbit_quroum_max_memory_bytes = 0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.rabbit_quroum_max_memory_length = 0 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.ssl = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_rabbit.ssl_version = {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_notifications.retry = -1 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_messaging_notifications.transport_url = **** {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_limit.auth_section = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_limit.auth_type = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_limit.cafile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_limit.certfile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_limit.collect_timing = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_limit.connect_retries = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_limit.connect_retry_delay = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_limit.endpoint_id = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_limit.endpoint_override = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_limit.insecure = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_limit.keyfile = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_limit.max_version = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_limit.min_version = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_limit.region_name = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_limit.service_name = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_limit.service_type = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_limit.split_loggers = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_limit.status_code_retries = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_limit.status_code_retry_delay = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_limit.timeout = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_limit.valid_interfaces = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_limit.version = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_reports.file_event_handler = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_reports.file_event_handler_interval = 1 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] oslo_reports.log_dir = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vif_plug_linux_bridge_privileged.group = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vif_plug_linux_bridge_privileged.thread_pool_size = 12 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vif_plug_linux_bridge_privileged.user = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vif_plug_ovs_privileged.group = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vif_plug_ovs_privileged.helper_command = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vif_plug_ovs_privileged.thread_pool_size = 12 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] vif_plug_ovs_privileged.user = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] os_vif_linux_bridge.flat_interface = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] os_vif_linux_bridge.vlan_interface = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] os_vif_ovs.isolate_vif = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] os_vif_ovs.ovsdb_interface = native {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] os_vif_ovs.per_port_bridge = False {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] os_brick.lock_path = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] privsep_osbrick.capabilities = [21] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] privsep_osbrick.group = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] privsep_osbrick.helper_command = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] privsep_osbrick.thread_pool_size = 12 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] privsep_osbrick.user = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] nova_sys_admin.group = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] nova_sys_admin.helper_command = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] nova_sys_admin.thread_pool_size = 12 {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] nova_sys_admin.user = None {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG oslo_service.service [None req-18b94b28-eaf8-4b31-9167-2f8260109cfa None None] ******************************************************************************** {{(pid=71283) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}}
Apr 20 10:24:09 user nova-compute[71283]: INFO nova.service [-] Starting compute node (version 0.0.0)
Apr 20 10:24:09 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Starting native event thread {{(pid=71283) _init_events /opt/stack/nova/nova/virt/libvirt/host.py:492}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Starting green dispatch thread {{(pid=71283) _init_events /opt/stack/nova/nova/virt/libvirt/host.py:498}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Starting connection event dispatch thread {{(pid=71283) initialize /opt/stack/nova/nova/virt/libvirt/host.py:620}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Connecting to libvirt: qemu:///system {{(pid=71283) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:503}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Registering for lifecycle events {{(pid=71283) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:509}}
Apr 20 10:24:09 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Registering for connection events: {{(pid=71283) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:530}}
Apr 20 10:24:09 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Connection event '1' reason 'None'
Apr 20 10:24:09 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Cannot update service status on host "user" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host user could not be found.
Apr 20 10:24:09 user nova-compute[71283]: DEBUG nova.virt.libvirt.volume.mount [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Initialising _HostMountState generation 0 {{(pid=71283) host_up /opt/stack/nova/nova/virt/libvirt/volume/mount.py:130}}
Apr 20 10:24:16 user nova-compute[71283]: INFO nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Libvirt host capabilities
Apr 20 10:24:16 user nova-compute[71283]: [libvirt capabilities XML dump: the markup was stripped in capture, leaving only element text interleaved with syslog prefixes. Recoverable host details: UUID e20c3142-5af9-7467-ecd8-70b2e4a210d6; arch x86_64; CPU model IvyBridge-IBRS, vendor Intel; migration transports tcp and rdma; NUMA cell memory/page counts 8152920 / 2038230 / 0 and 8255068 / 2063767 / 0; secmodels apparmor (doi 0) and dac (doi 0, baselabel +64055:+108). Recoverable guest entries (all hvm): /usr/bin/qemu-system-alpha (64-bit; clipper); /usr/bin/qemu-system-arm (32-bit, listed twice; virt-2.6 through virt-6.2, raspi0/raspi1ap/raspi2b, vexpress-a9/a15, mps2-*/mps3-*, realview-*, versatileab/versatilepb, xilinx-zynq-a9, numerous *-bmc boards, and other machine types); /usr/bin/qemu-system-aarch64 (64-bit; the same machine list plus xlnx-zcu102, xlnx-versal-virt, sbsa-ref, raspi3ap, raspi3b); /usr/bin/qemu-system-cris (32-bit; axis-dev88); /usr/bin/qemu-system-i386 (32-bit; pc-i440fx-* and pc-q35-* families including Ubuntu release aliases and -hpb variants, ubuntu, ubuntu-q35, pc, q35, isapc, microvm, x-remote); /usr/bin/qemu-system-m68k (32-bit; mcf5208evb, an5206, q800, next-cube, virt-6.0 through virt-6.2, virt); /usr/bin/qemu-system-microblaze and /usr/bin/qemu-system-microblazeel (32-bit; petalogix-s3adsp1800, petalogix-ml605, xlnx-zynqmp-pmu); /usr/bin/qemu-system-mips and /usr/bin/qemu-system-mipsel (32-bit; malta, mipssim); /usr/bin/qemu-system-mips64 (64-bit; malta, mipssim, pica61, magnum; dump continues past this excerpt).]
Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: hvm Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: 64 Apr 20 10:24:16 user nova-compute[71283]: /usr/bin/qemu-system-mips64el Apr 20 10:24:16 user nova-compute[71283]: malta Apr 20 10:24:16 user nova-compute[71283]: loongson3-virt Apr 20 10:24:16 user nova-compute[71283]: mipssim Apr 20 10:24:16 user nova-compute[71283]: pica61 Apr 20 10:24:16 user nova-compute[71283]: magnum Apr 20 10:24:16 user nova-compute[71283]: boston Apr 20 10:24:16 user nova-compute[71283]: fuloong2e Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: hvm Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: 32 Apr 20 10:24:16 user nova-compute[71283]: /usr/bin/qemu-system-ppc Apr 20 10:24:16 user nova-compute[71283]: g3beige Apr 20 10:24:16 user nova-compute[71283]: virtex-ml507 Apr 20 10:24:16 user nova-compute[71283]: mac99 Apr 20 10:24:16 user nova-compute[71283]: ppce500 Apr 20 10:24:16 user nova-compute[71283]: pegasos2 Apr 20 10:24:16 user nova-compute[71283]: sam460ex Apr 20 10:24:16 user nova-compute[71283]: bamboo Apr 20 10:24:16 user nova-compute[71283]: 40p Apr 20 10:24:16 user nova-compute[71283]: ref405ep Apr 20 10:24:16 user nova-compute[71283]: mpc8544ds Apr 20 10:24:16 user 
nova-compute[71283]: taihu Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: hvm Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: 64 Apr 20 10:24:16 user nova-compute[71283]: /usr/bin/qemu-system-ppc64 Apr 20 10:24:16 user nova-compute[71283]: pseries-jammy Apr 20 10:24:16 user nova-compute[71283]: pseries Apr 20 10:24:16 user nova-compute[71283]: powernv9 Apr 20 10:24:16 user nova-compute[71283]: powernv Apr 20 10:24:16 user nova-compute[71283]: taihu Apr 20 10:24:16 user nova-compute[71283]: pseries-4.1 Apr 20 10:24:16 user nova-compute[71283]: mpc8544ds Apr 20 10:24:16 user nova-compute[71283]: pseries-6.1 Apr 20 10:24:16 user nova-compute[71283]: pseries-2.5 Apr 20 10:24:16 user nova-compute[71283]: powernv10 Apr 20 10:24:16 user nova-compute[71283]: pseries-xenial Apr 20 10:24:16 user nova-compute[71283]: pseries-4.2 Apr 20 10:24:16 user nova-compute[71283]: pseries-6.2 Apr 20 10:24:16 user nova-compute[71283]: pseries-yakkety Apr 20 10:24:16 user nova-compute[71283]: pseries-2.6 Apr 20 10:24:16 user nova-compute[71283]: ppce500 Apr 20 10:24:16 user nova-compute[71283]: pseries-bionic-sxxm Apr 20 10:24:16 user nova-compute[71283]: pseries-2.7 Apr 20 10:24:16 user nova-compute[71283]: pseries-3.0 Apr 20 10:24:16 user nova-compute[71283]: pseries-5.0 Apr 20 10:24:16 user nova-compute[71283]: 40p Apr 20 10:24:16 user nova-compute[71283]: pseries-2.8 Apr 20 10:24:16 user nova-compute[71283]: pegasos2 Apr 20 10:24:16 user nova-compute[71283]: pseries-hirsute Apr 20 10:24:16 user nova-compute[71283]: pseries-3.1 Apr 20 10:24:16 user nova-compute[71283]: pseries-5.1 Apr 
20 10:24:16 user nova-compute[71283]: pseries-eoan Apr 20 10:24:16 user nova-compute[71283]: pseries-2.9 Apr 20 10:24:16 user nova-compute[71283]: pseries-zesty Apr 20 10:24:16 user nova-compute[71283]: bamboo Apr 20 10:24:16 user nova-compute[71283]: pseries-groovy Apr 20 10:24:16 user nova-compute[71283]: pseries-focal Apr 20 10:24:16 user nova-compute[71283]: g3beige Apr 20 10:24:16 user nova-compute[71283]: pseries-5.2 Apr 20 10:24:16 user nova-compute[71283]: pseries-disco Apr 20 10:24:16 user nova-compute[71283]: pseries-2.12-sxxm Apr 20 10:24:16 user nova-compute[71283]: pseries-2.10 Apr 20 10:24:16 user nova-compute[71283]: virtex-ml507 Apr 20 10:24:16 user nova-compute[71283]: pseries-2.11 Apr 20 10:24:16 user nova-compute[71283]: pseries-2.1 Apr 20 10:24:16 user nova-compute[71283]: pseries-cosmic Apr 20 10:24:16 user nova-compute[71283]: pseries-bionic Apr 20 10:24:16 user nova-compute[71283]: pseries-2.12 Apr 20 10:24:16 user nova-compute[71283]: pseries-2.2 Apr 20 10:24:16 user nova-compute[71283]: mac99 Apr 20 10:24:16 user nova-compute[71283]: pseries-impish Apr 20 10:24:16 user nova-compute[71283]: pseries-artful Apr 20 10:24:16 user nova-compute[71283]: sam460ex Apr 20 10:24:16 user nova-compute[71283]: ref405ep Apr 20 10:24:16 user nova-compute[71283]: pseries-2.3 Apr 20 10:24:16 user nova-compute[71283]: powernv8 Apr 20 10:24:16 user nova-compute[71283]: pseries-4.0 Apr 20 10:24:16 user nova-compute[71283]: pseries-6.0 Apr 20 10:24:16 user nova-compute[71283]: pseries-2.4 Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: hvm Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 
user nova-compute[71283]: 64 Apr 20 10:24:16 user nova-compute[71283]: /usr/bin/qemu-system-ppc64le Apr 20 10:24:16 user nova-compute[71283]: pseries-jammy Apr 20 10:24:16 user nova-compute[71283]: pseries Apr 20 10:24:16 user nova-compute[71283]: powernv9 Apr 20 10:24:16 user nova-compute[71283]: powernv Apr 20 10:24:16 user nova-compute[71283]: taihu Apr 20 10:24:16 user nova-compute[71283]: pseries-4.1 Apr 20 10:24:16 user nova-compute[71283]: mpc8544ds Apr 20 10:24:16 user nova-compute[71283]: pseries-6.1 Apr 20 10:24:16 user nova-compute[71283]: pseries-2.5 Apr 20 10:24:16 user nova-compute[71283]: powernv10 Apr 20 10:24:16 user nova-compute[71283]: pseries-xenial Apr 20 10:24:16 user nova-compute[71283]: pseries-4.2 Apr 20 10:24:16 user nova-compute[71283]: pseries-6.2 Apr 20 10:24:16 user nova-compute[71283]: pseries-yakkety Apr 20 10:24:16 user nova-compute[71283]: pseries-2.6 Apr 20 10:24:16 user nova-compute[71283]: ppce500 Apr 20 10:24:16 user nova-compute[71283]: pseries-bionic-sxxm Apr 20 10:24:16 user nova-compute[71283]: pseries-2.7 Apr 20 10:24:16 user nova-compute[71283]: pseries-3.0 Apr 20 10:24:16 user nova-compute[71283]: pseries-5.0 Apr 20 10:24:16 user nova-compute[71283]: 40p Apr 20 10:24:16 user nova-compute[71283]: pseries-2.8 Apr 20 10:24:16 user nova-compute[71283]: pegasos2 Apr 20 10:24:16 user nova-compute[71283]: pseries-hirsute Apr 20 10:24:16 user nova-compute[71283]: pseries-3.1 Apr 20 10:24:16 user nova-compute[71283]: pseries-5.1 Apr 20 10:24:16 user nova-compute[71283]: pseries-eoan Apr 20 10:24:16 user nova-compute[71283]: pseries-2.9 Apr 20 10:24:16 user nova-compute[71283]: pseries-zesty Apr 20 10:24:16 user nova-compute[71283]: bamboo Apr 20 10:24:16 user nova-compute[71283]: pseries-groovy Apr 20 10:24:16 user nova-compute[71283]: pseries-focal Apr 20 10:24:16 user nova-compute[71283]: g3beige Apr 20 10:24:16 user nova-compute[71283]: pseries-5.2 Apr 20 10:24:16 user nova-compute[71283]: pseries-disco Apr 20 10:24:16 user 
nova-compute[71283]: pseries-2.12-sxxm Apr 20 10:24:16 user nova-compute[71283]: pseries-2.10 Apr 20 10:24:16 user nova-compute[71283]: virtex-ml507 Apr 20 10:24:16 user nova-compute[71283]: pseries-2.11 Apr 20 10:24:16 user nova-compute[71283]: pseries-2.1 Apr 20 10:24:16 user nova-compute[71283]: pseries-cosmic Apr 20 10:24:16 user nova-compute[71283]: pseries-bionic Apr 20 10:24:16 user nova-compute[71283]: pseries-2.12 Apr 20 10:24:16 user nova-compute[71283]: pseries-2.2 Apr 20 10:24:16 user nova-compute[71283]: mac99 Apr 20 10:24:16 user nova-compute[71283]: pseries-impish Apr 20 10:24:16 user nova-compute[71283]: pseries-artful Apr 20 10:24:16 user nova-compute[71283]: sam460ex Apr 20 10:24:16 user nova-compute[71283]: ref405ep Apr 20 10:24:16 user nova-compute[71283]: pseries-2.3 Apr 20 10:24:16 user nova-compute[71283]: powernv8 Apr 20 10:24:16 user nova-compute[71283]: pseries-4.0 Apr 20 10:24:16 user nova-compute[71283]: pseries-6.0 Apr 20 10:24:16 user nova-compute[71283]: pseries-2.4 Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: hvm Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: 32 Apr 20 10:24:16 user nova-compute[71283]: /usr/bin/qemu-system-riscv32 Apr 20 10:24:16 user nova-compute[71283]: spike Apr 20 10:24:16 user nova-compute[71283]: opentitan Apr 20 10:24:16 user nova-compute[71283]: sifive_u Apr 20 10:24:16 user nova-compute[71283]: sifive_e Apr 20 10:24:16 user nova-compute[71283]: virt Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user 
nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: hvm Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: 64 Apr 20 10:24:16 user nova-compute[71283]: /usr/bin/qemu-system-riscv64 Apr 20 10:24:16 user nova-compute[71283]: spike Apr 20 10:24:16 user nova-compute[71283]: microchip-icicle-kit Apr 20 10:24:16 user nova-compute[71283]: sifive_u Apr 20 10:24:16 user nova-compute[71283]: shakti_c Apr 20 10:24:16 user nova-compute[71283]: sifive_e Apr 20 10:24:16 user nova-compute[71283]: virt Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: hvm Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: 64 Apr 20 10:24:16 user nova-compute[71283]: /usr/bin/qemu-system-s390x Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-jammy Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-4.0 Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-5.2 Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-artful Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-3.1 Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-groovy Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-hirsute Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-disco Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-2.12 Apr 20 10:24:16 user nova-compute[71283]: 
s390-ccw-virtio-2.6 Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-yakkety Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-eoan Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-2.9 Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-6.0 Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-5.1 Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-3.0 Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-4.2 Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-2.5 Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-2.11 Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-xenial Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-focal Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-2.8 Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-impish Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-bionic Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-5.0 Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-6.2 Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-zesty Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-4.1 Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-cosmic Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-2.4 Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-2.10 Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-2.7 Apr 20 10:24:16 user nova-compute[71283]: s390-ccw-virtio-6.1 Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: hvm Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: 32 Apr 20 10:24:16 
user nova-compute[71283]: /usr/bin/qemu-system-sh4 Apr 20 10:24:16 user nova-compute[71283]: shix Apr 20 10:24:16 user nova-compute[71283]: r2d Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: hvm Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: 64 Apr 20 10:24:16 user nova-compute[71283]: /usr/bin/qemu-system-sh4eb Apr 20 10:24:16 user nova-compute[71283]: shix Apr 20 10:24:16 user nova-compute[71283]: r2d Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: hvm Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: 32 Apr 20 10:24:16 user nova-compute[71283]: /usr/bin/qemu-system-sparc Apr 20 10:24:16 user nova-compute[71283]: SS-5 Apr 20 10:24:16 user nova-compute[71283]: SS-20 Apr 20 10:24:16 user nova-compute[71283]: LX Apr 20 10:24:16 user nova-compute[71283]: SPARCClassic Apr 20 10:24:16 user nova-compute[71283]: leon3_generic Apr 20 10:24:16 user nova-compute[71283]: SPARCbook Apr 20 10:24:16 user nova-compute[71283]: SS-4 Apr 20 10:24:16 user nova-compute[71283]: SS-600MP Apr 20 10:24:16 user nova-compute[71283]: SS-10 Apr 20 10:24:16 user nova-compute[71283]: Voyager Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 
user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: hvm Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: 64 Apr 20 10:24:16 user nova-compute[71283]: /usr/bin/qemu-system-sparc64 Apr 20 10:24:16 user nova-compute[71283]: sun4u Apr 20 10:24:16 user nova-compute[71283]: niagara Apr 20 10:24:16 user nova-compute[71283]: sun4v Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: hvm Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: 64 Apr 20 10:24:16 user nova-compute[71283]: /usr/bin/qemu-system-x86_64 Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-jammy Apr 20 10:24:16 user nova-compute[71283]: ubuntu Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-impish-hpb Apr 20 10:24:16 user nova-compute[71283]: pc-q35-5.2 Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-2.12 Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-2.0 Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-xenial Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-6.2 Apr 20 10:24:16 user nova-compute[71283]: pc Apr 20 10:24:16 user nova-compute[71283]: pc-q35-4.2 Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-2.5 Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-4.2 Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-hirsute Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-focal Apr 20 
10:24:16 user nova-compute[71283]: pc-q35-xenial Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-jammy-hpb Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-5.2 Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-1.5 Apr 20 10:24:16 user nova-compute[71283]: pc-q35-2.7 Apr 20 10:24:16 user nova-compute[71283]: pc-q35-eoan-hpb Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-zesty Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-disco-hpb Apr 20 10:24:16 user nova-compute[71283]: pc-q35-groovy Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-groovy Apr 20 10:24:16 user nova-compute[71283]: pc-q35-artful Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-trusty Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-2.2 Apr 20 10:24:16 user nova-compute[71283]: pc-q35-focal-hpb Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-eoan-hpb Apr 20 10:24:16 user nova-compute[71283]: pc-q35-bionic-hpb Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-artful Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-2.7 Apr 20 10:24:16 user nova-compute[71283]: pc-q35-6.1 Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-yakkety Apr 20 10:24:16 user nova-compute[71283]: pc-q35-2.4 Apr 20 10:24:16 user nova-compute[71283]: pc-q35-cosmic-hpb Apr 20 10:24:16 user nova-compute[71283]: pc-q35-2.10 Apr 20 10:24:16 user nova-compute[71283]: x-remote Apr 20 10:24:16 user nova-compute[71283]: pc-q35-5.1 Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-1.7 Apr 20 10:24:16 user nova-compute[71283]: pc-q35-2.9 Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-2.11 Apr 20 10:24:16 user nova-compute[71283]: pc-q35-3.1 Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-6.1 Apr 20 10:24:16 user nova-compute[71283]: pc-q35-4.1 Apr 20 10:24:16 user nova-compute[71283]: pc-q35-jammy Apr 20 10:24:16 user nova-compute[71283]: ubuntu-q35 Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-2.4 Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-4.1 Apr 20 10:24:16 user 
nova-compute[71283]: pc-q35-eoan Apr 20 10:24:16 user nova-compute[71283]: pc-q35-jammy-hpb Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-5.1 Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-2.9 Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-bionic-hpb Apr 20 10:24:16 user nova-compute[71283]: isapc Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-1.4 Apr 20 10:24:16 user nova-compute[71283]: pc-q35-cosmic Apr 20 10:24:16 user nova-compute[71283]: pc-q35-2.6 Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-3.1 Apr 20 10:24:16 user nova-compute[71283]: pc-q35-bionic Apr 20 10:24:16 user nova-compute[71283]: pc-q35-disco-hpb Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-cosmic Apr 20 10:24:16 user nova-compute[71283]: pc-q35-2.12 Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-bionic Apr 20 10:24:16 user nova-compute[71283]: pc-q35-groovy-hpb Apr 20 10:24:16 user nova-compute[71283]: pc-q35-disco Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-cosmic-hpb Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-2.1 Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-wily Apr 20 10:24:16 user nova-compute[71283]: pc-q35-impish Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-2.6 Apr 20 10:24:16 user nova-compute[71283]: pc-q35-6.0 Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-impish Apr 20 10:24:16 user nova-compute[71283]: pc-q35-impish-hpb Apr 20 10:24:16 user nova-compute[71283]: pc-q35-hirsute Apr 20 10:24:16 user nova-compute[71283]: pc-q35-4.0.1 Apr 20 10:24:16 user nova-compute[71283]: pc-q35-hirsute-hpb Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-1.6 Apr 20 10:24:16 user nova-compute[71283]: pc-q35-5.0 Apr 20 10:24:16 user nova-compute[71283]: pc-q35-2.8 Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-2.10 Apr 20 10:24:16 user nova-compute[71283]: pc-q35-3.0 Apr 20 10:24:16 user nova-compute[71283]: pc-q35-zesty Apr 20 10:24:16 user nova-compute[71283]: pc-q35-4.0 Apr 20 10:24:16 user nova-compute[71283]: 
pc-q35-focal Apr 20 10:24:16 user nova-compute[71283]: microvm Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-6.0 Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-2.3 Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-disco Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-focal-hpb Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-4.0 Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-groovy-hpb Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-hirsute-hpb Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-5.0 Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-2.8 Apr 20 10:24:16 user nova-compute[71283]: pc-q35-6.2 Apr 20 10:24:16 user nova-compute[71283]: q35 Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-eoan Apr 20 10:24:16 user nova-compute[71283]: pc-q35-2.5 Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-3.0 Apr 20 10:24:16 user nova-compute[71283]: pc-q35-yakkety Apr 20 10:24:16 user nova-compute[71283]: pc-q35-2.11 Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: hvm Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: 32 Apr 20 10:24:16 user nova-compute[71283]: /usr/bin/qemu-system-xtensa Apr 20 10:24:16 user nova-compute[71283]: sim Apr 20 10:24:16 user nova-compute[71283]: kc705 Apr 20 10:24:16 user nova-compute[71283]: ml605 Apr 20 10:24:16 user nova-compute[71283]: ml605-nommu Apr 20 10:24:16 user nova-compute[71283]: virt Apr 20 10:24:16 user nova-compute[71283]: lx60-nommu Apr 20 10:24:16 user 
nova-compute[71283]: lx200 Apr 20 10:24:16 user nova-compute[71283]: lx200-nommu Apr 20 10:24:16 user nova-compute[71283]: lx60 Apr 20 10:24:16 user nova-compute[71283]: kc705-nommu Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: hvm Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: 32 Apr 20 10:24:16 user nova-compute[71283]: /usr/bin/qemu-system-xtensaeb Apr 20 10:24:16 user nova-compute[71283]: sim Apr 20 10:24:16 user nova-compute[71283]: kc705 Apr 20 10:24:16 user nova-compute[71283]: ml605 Apr 20 10:24:16 user nova-compute[71283]: ml605-nommu Apr 20 10:24:16 user nova-compute[71283]: virt Apr 20 10:24:16 user nova-compute[71283]: lx60-nommu Apr 20 10:24:16 user nova-compute[71283]: lx200 Apr 20 10:24:16 user nova-compute[71283]: lx200-nommu Apr 20 10:24:16 user nova-compute[71283]: lx60 Apr 20 10:24:16 user nova-compute[71283]: kc705-nommu Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Getting domain capabilities for alpha via machine types: {None} {{(pid=71283) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 10:24:16 user nova-compute[71283]: 
DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when retrieving domain capabilities for arch alpha / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-alpha' on this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Getting domain capabilities for armv6l via machine types: {None, 'virt'} {{(pid=71283) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when retrieving domain capabilities for arch armv6l / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when retrieving domain capabilities for arch armv6l / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Getting domain capabilities for armv7l via machine types: {'virt'} {{(pid=71283) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when retrieving domain capabilities for arch armv7l / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Getting domain capabilities for aarch64 via machine types: {'virt'} {{(pid=71283) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when retrieving domain capabilities for arch aarch64 / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-aarch64' on this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Getting domain capabilities for cris via machine types: {None} {{(pid=71283) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when retrieving domain capabilities for arch cris / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-cris' on this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Getting domain capabilities for i686 via machine types: {'pc', 'ubuntu', 'ubuntu-q35', 'q35'} {{(pid=71283) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 20 10:24:16 user nova-compute[71283]: DEBUG
nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc: [domainCapabilities XML dump; markup was stripped during extraction, recoverable values: emulator /usr/bin/qemu-system-i386; domain kvm; machine pc-i440fx-6.2; arch i686; loaders /usr/share/OVMF/OVMF_CODE.fd, /usr/share/OVMF/OVMF_CODE.secboot.fd, /usr/share/AAVMF/AAVMF_CODE.fd, /usr/share/AAVMF/AAVMF32_CODE.fd, /usr/share/OVMF/OVMF_CODE.ms.fd (types rom, pflash; readonly yes, no; secure no); host CPU model IvyBridge-IBRS, vendor Intel; custom CPU models qemu64, qemu32, phenom, pentium3, pentium2, pentium, n270, kvm64, kvm32, coreduo, core2duo, athlon, Westmere-IBRS, Westmere, Snowridge, Skylake-Server-noTSX-IBRS, Skylake-Server-IBRS, Skylake-Server, Skylake-Client-noTSX-IBRS, Skylake-Client-IBRS, Skylake-Client, SandyBridge-IBRS, SandyBridge, Penryn, Opteron_G5, Opteron_G4, Opteron_G3, Opteron_G2, Opteron_G1, Nehalem-IBRS, Nehalem, IvyBridge-IBRS, IvyBridge, Icelake-Server-noTSX, Icelake-Server, Icelake-Client-noTSX, Icelake-Client, Haswell-noTSX-IBRS, Haswell-noTSX, Haswell-IBRS, Haswell, EPYC-Rome, EPYC-Milan, EPYC-IBPB, EPYC, Dhyana, Cooperlake, Conroe, Cascadelake-Server-noTSX, Cascadelake-Server, Broadwell-noTSX-IBRS, Broadwell-noTSX, Broadwell-IBRS, Broadwell, 486; memory backing file, anonymous, memfd; disk devices disk, cdrom, floppy, lun on buses ide, fdc, scsi, virtio, usb, sata with models virtio, virtio-transitional, virtio-non-transitional; graphics sdl, vnc, spice, egl-headless; hostdev mode subsystem (startupPolicy default, mandatory, requisite, optional; types usb, pci, scsi; models virtio, virtio-transitional, virtio-non-transitional); rng models random, egd, builtin; filesystem drivers path, handle, virtiofs; tpm models tpm-tis, tpm-crb with backends passthrough, emulator] {{(pid=71283) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=ubuntu: [domainCapabilities XML dump; markup stripped during extraction; machine pc-i440fx-jammy; all other recoverable values identical to the machine_type=pc record above] {{(pid=71283) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=ubuntu-q35: [domainCapabilities XML dump; markup stripped during extraction; machine pc-q35-jammy; disk bus list omits ide (fdc, scsi, virtio, usb, sata); all other recoverable values identical to the machine_type=pc record above] {{(pid=71283) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35: [domainCapabilities XML dump; markup stripped during extraction; machine pc-q35-6.2; emulator, loader, and host CPU values identical to the machine_type=pc record above; custom CPU model list truncated at the end of this log chunk]
20 10:24:16 user nova-compute[71283]: 486 Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: file Apr 20 10:24:16 user nova-compute[71283]: anonymous Apr 20 10:24:16 user nova-compute[71283]: memfd Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: disk Apr 20 10:24:16 user nova-compute[71283]: cdrom Apr 20 10:24:16 user nova-compute[71283]: floppy Apr 20 10:24:16 user nova-compute[71283]: lun Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: fdc Apr 20 10:24:16 user nova-compute[71283]: scsi Apr 20 10:24:16 user nova-compute[71283]: virtio Apr 20 10:24:16 user nova-compute[71283]: usb Apr 20 10:24:16 user nova-compute[71283]: sata Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: virtio Apr 20 10:24:16 user nova-compute[71283]: virtio-transitional Apr 20 10:24:16 user nova-compute[71283]: virtio-non-transitional Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: sdl Apr 20 10:24:16 user nova-compute[71283]: vnc Apr 20 10:24:16 user nova-compute[71283]: spice Apr 20 10:24:16 user nova-compute[71283]: egl-headless Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: subsystem Apr 20 10:24:16 user 
nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: default Apr 20 10:24:16 user nova-compute[71283]: mandatory Apr 20 10:24:16 user nova-compute[71283]: requisite Apr 20 10:24:16 user nova-compute[71283]: optional Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: usb Apr 20 10:24:16 user nova-compute[71283]: pci Apr 20 10:24:16 user nova-compute[71283]: scsi Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: virtio Apr 20 10:24:16 user nova-compute[71283]: virtio-transitional Apr 20 10:24:16 user nova-compute[71283]: virtio-non-transitional Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: random Apr 20 10:24:16 user nova-compute[71283]: egd Apr 20 10:24:16 user nova-compute[71283]: builtin Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: path Apr 20 10:24:16 user nova-compute[71283]: handle Apr 20 10:24:16 user nova-compute[71283]: virtiofs Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: tpm-tis Apr 20 10:24:16 user nova-compute[71283]: tpm-crb Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: passthrough Apr 20 10:24:16 user nova-compute[71283]: emulator Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user 
nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: {{(pid=71283) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Getting domain capabilities for m68k via machine types: {None, 'virt'} {{(pid=71283) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when retrieving domain capabilities for arch m68k / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-m68k' on this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when retrieving domain capabilities for arch m68k / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-m68k' on this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Getting domain capabilities for microblaze via machine types: {None} {{(pid=71283) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 10:24:16 user 
nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when retrieving domain capabilities for arch microblaze / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-microblaze' on this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Getting domain capabilities for microblazeel via machine types: {None} {{(pid=71283) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when retrieving domain capabilities for arch microblazeel / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-microblazeel' on this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Getting domain capabilities for mips via machine types: {None} {{(pid=71283) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when retrieving domain capabilities for arch mips / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips' on this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Getting domain capabilities for 
mipsel via machine types: {None} {{(pid=71283) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when retrieving domain capabilities for arch mipsel / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mipsel' on this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Getting domain capabilities for mips64 via machine types: {None} {{(pid=71283) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when retrieving domain capabilities for arch mips64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips64' on this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Getting domain capabilities for mips64el via machine types: {None} {{(pid=71283) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when retrieving domain capabilities for arch mips64el / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips64el' on this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 10:24:16 user nova-compute[71283]: 
DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Getting domain capabilities for ppc via machine types: {None} {{(pid=71283) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when retrieving domain capabilities for arch ppc / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc' on this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Getting domain capabilities for ppc64 via machine types: {'powernv', 'pseries', None} {{(pid=71283) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type powernv: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type pseries: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when retrieving domain capabilities for arch 
ppc64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Getting domain capabilities for ppc64le via machine types: {'powernv', 'pseries'} {{(pid=71283) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when retrieving domain capabilities for arch ppc64le / virt_type kvm / machine_type powernv: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64le' on this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when retrieving domain capabilities for arch ppc64le / virt_type kvm / machine_type pseries: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64le' on this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Getting domain capabilities for riscv32 via machine types: {None} {{(pid=71283) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when retrieving domain capabilities for arch riscv32 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-riscv32' on 
this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Getting domain capabilities for riscv64 via machine types: {None} {{(pid=71283) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when retrieving domain capabilities for arch riscv64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-riscv64' on this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Getting domain capabilities for s390x via machine types: {'s390-ccw-virtio'} {{(pid=71283) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when retrieving domain capabilities for arch s390x / virt_type kvm / machine_type s390-ccw-virtio: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-s390x' on this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Getting domain capabilities for sh4 via machine types: {None} {{(pid=71283) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when retrieving domain capabilities for arch sh4 
/ virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sh4' on this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Getting domain capabilities for sh4eb via machine types: {None} {{(pid=71283) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when retrieving domain capabilities for arch sh4eb / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sh4eb' on this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Getting domain capabilities for sparc via machine types: {None} {{(pid=71283) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when retrieving domain capabilities for arch sparc / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sparc' on this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Getting domain capabilities for sparc64 via machine types: {None} {{(pid=71283) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None 
req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when retrieving domain capabilities for arch sparc64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sparc64' on this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Getting domain capabilities for x86_64 via machine types: {'pc', 'ubuntu', 'ubuntu-q35', 'q35'} {{(pid=71283) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: /usr/bin/qemu-system-x86_64 Apr 20 10:24:16 user nova-compute[71283]: kvm Apr 20 10:24:16 user nova-compute[71283]: pc-i440fx-6.2 Apr 20 10:24:16 user nova-compute[71283]: x86_64 Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: efi Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: /usr/share/OVMF/OVMF_CODE_4M.fd Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: rom Apr 20 10:24:16 user nova-compute[71283]: pflash Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: yes Apr 20 10:24:16 user nova-compute[71283]: no Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: no Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 
user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: on Apr 20 10:24:16 user nova-compute[71283]: off Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: on Apr 20 10:24:16 user nova-compute[71283]: off Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: IvyBridge-IBRS Apr 20 10:24:16 user nova-compute[71283]: Intel Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: qemu64 Apr 20 10:24:16 user nova-compute[71283]: qemu32 Apr 20 10:24:16 user nova-compute[71283]: phenom Apr 20 10:24:16 user nova-compute[71283]: pentium3 Apr 20 10:24:16 user nova-compute[71283]: pentium2 Apr 20 
10:24:16 user nova-compute[71283]: pentium Apr 20 10:24:16 user nova-compute[71283]: n270 Apr 20 10:24:16 user nova-compute[71283]: kvm64 Apr 20 10:24:16 user nova-compute[71283]: kvm32 Apr 20 10:24:16 user nova-compute[71283]: coreduo Apr 20 10:24:16 user nova-compute[71283]: core2duo Apr 20 10:24:16 user nova-compute[71283]: athlon Apr 20 10:24:16 user nova-compute[71283]: Westmere-IBRS Apr 20 10:24:16 user nova-compute[71283]: Westmere Apr 20 10:24:16 user nova-compute[71283]: Snowridge Apr 20 10:24:16 user nova-compute[71283]: Skylake-Server-noTSX-IBRS Apr 20 10:24:16 user nova-compute[71283]: Skylake-Server-IBRS Apr 20 10:24:16 user nova-compute[71283]: Skylake-Server Apr 20 10:24:16 user nova-compute[71283]: Skylake-Client-noTSX-IBRS Apr 20 10:24:16 user nova-compute[71283]: Skylake-Client-IBRS Apr 20 10:24:16 user nova-compute[71283]: Skylake-Client Apr 20 10:24:16 user nova-compute[71283]: SandyBridge-IBRS Apr 20 10:24:16 user nova-compute[71283]: SandyBridge Apr 20 10:24:16 user nova-compute[71283]: Penryn Apr 20 10:24:16 user nova-compute[71283]: Opteron_G5 Apr 20 10:24:16 user nova-compute[71283]: Opteron_G4 Apr 20 10:24:16 user nova-compute[71283]: Opteron_G3 Apr 20 10:24:16 user nova-compute[71283]: Opteron_G2 Apr 20 10:24:16 user nova-compute[71283]: Opteron_G1 Apr 20 10:24:16 user nova-compute[71283]: Nehalem-IBRS Apr 20 10:24:16 user nova-compute[71283]: Nehalem Apr 20 10:24:16 user nova-compute[71283]: IvyBridge-IBRS Apr 20 10:24:16 user nova-compute[71283]: IvyBridge Apr 20 10:24:16 user nova-compute[71283]: Icelake-Server-noTSX Apr 20 10:24:16 user nova-compute[71283]: Icelake-Server Apr 20 10:24:16 user nova-compute[71283]: Icelake-Client-noTSX Apr 20 10:24:16 user nova-compute[71283]: Icelake-Client Apr 20 10:24:16 user nova-compute[71283]: Haswell-noTSX-IBRS Apr 20 10:24:16 user nova-compute[71283]: Haswell-noTSX Apr 20 10:24:16 user nova-compute[71283]: Haswell-IBRS Apr 20 10:24:16 user nova-compute[71283]: Haswell Apr 20 10:24:16 user 
nova-compute[71283]: EPYC-Rome Apr 20 10:24:16 user nova-compute[71283]: EPYC-Milan Apr 20 10:24:16 user nova-compute[71283]: EPYC-IBPB Apr 20 10:24:16 user nova-compute[71283]: EPYC Apr 20 10:24:16 user nova-compute[71283]: Dhyana Apr 20 10:24:16 user nova-compute[71283]: Cooperlake Apr 20 10:24:16 user nova-compute[71283]: Conroe Apr 20 10:24:16 user nova-compute[71283]: Cascadelake-Server-noTSX Apr 20 10:24:16 user nova-compute[71283]: Cascadelake-Server Apr 20 10:24:16 user nova-compute[71283]: Broadwell-noTSX-IBRS Apr 20 10:24:16 user nova-compute[71283]: Broadwell-noTSX Apr 20 10:24:16 user nova-compute[71283]: Broadwell-IBRS Apr 20 10:24:16 user nova-compute[71283]: Broadwell Apr 20 10:24:16 user nova-compute[71283]: 486 Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: file Apr 20 10:24:16 user nova-compute[71283]: anonymous Apr 20 10:24:16 user nova-compute[71283]: memfd Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: disk Apr 20 10:24:16 user nova-compute[71283]: cdrom Apr 20 10:24:16 user nova-compute[71283]: floppy Apr 20 10:24:16 user nova-compute[71283]: lun Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: ide Apr 20 10:24:16 user nova-compute[71283]: fdc Apr 20 10:24:16 user nova-compute[71283]: scsi Apr 20 10:24:16 user nova-compute[71283]: virtio Apr 20 10:24:16 user nova-compute[71283]: usb Apr 20 10:24:16 user nova-compute[71283]: sata Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: Apr 20 10:24:16 user nova-compute[71283]: virtio Apr 20 10:24:16 user nova-compute[71283]: 
[… remainder of domainCapabilities XML (markup lost in log capture): disk model virtio/virtio-transitional/virtio-non-transitional; graphics sdl/vnc/spice/egl-headless; hostdev mode subsystem, startupPolicy default/mandatory/requisite/optional, type usb/pci/scsi, model virtio/virtio-transitional/virtio-non-transitional; rng backend random/egd/builtin; filesystem driver path/handle/virtiofs; tpm model tpm-tis/tpm-crb, backend passthrough/emulator …] {{(pid=71283) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=ubuntu:
Apr 20 10:24:16 user nova-compute[71283]: [domainCapabilities XML (markup lost in log capture): path /usr/bin/qemu-system-x86_64; domain kvm; machine pc-i440fx-jammy; arch x86_64; os firmware efi, loader /usr/share/OVMF/OVMF_CODE_4M.fd, loader type rom/pflash, readonly yes/no, secure no; cpu host-passthrough on/off, host-model IvyBridge-IBRS (vendor Intel), custom models qemu64, qemu32, phenom, pentium3, pentium2, pentium, n270, kvm64, kvm32, coreduo, core2duo, athlon, Westmere(-IBRS), Snowridge, Skylake-Server/-Client (incl. -IBRS, -noTSX-IBRS), SandyBridge(-IBRS), Penryn, Opteron_G1..G5, Nehalem(-IBRS), IvyBridge(-IBRS), Icelake-Server/-Client (incl. -noTSX), Haswell (incl. -IBRS, -noTSX, -noTSX-IBRS), EPYC(-Rome/-Milan/-IBPB), Dhyana, Cooperlake, Conroe, Cascadelake-Server(-noTSX), Broadwell (incl. -IBRS, -noTSX, -noTSX-IBRS), 486; memory backing file/anonymous/memfd; disk device disk/cdrom/floppy/lun, bus ide/fdc/scsi/virtio/usb/sata, model virtio/virtio-transitional/virtio-non-transitional; graphics sdl/vnc/spice/egl-headless; hostdev mode subsystem, startupPolicy default/mandatory/requisite/optional, type usb/pci/scsi, model virtio/virtio-transitional/virtio-non-transitional; rng backend random/egd/builtin; filesystem driver path/handle/virtiofs; tpm model tpm-tis/tpm-crb, backend passthrough/emulator] {{(pid=71283) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=ubuntu-q35:
Apr 20 10:24:16 user nova-compute[71283]: [domainCapabilities XML (markup lost in log capture): path /usr/bin/qemu-system-x86_64; domain kvm; machine pc-q35-jammy; arch x86_64; os firmware efi, loaders /usr/share/OVMF/OVMF_CODE_4M.ms.fd, /usr/share/OVMF/OVMF_CODE_4M.secboot.fd, /usr/share/OVMF/OVMF_CODE_4M.fd, loader type rom/pflash, readonly yes/no, secure yes/no; cpu host-passthrough on/off, host-model IvyBridge-IBRS (vendor Intel), same custom model list as for machine_type=ubuntu above; memory backing file/anonymous/memfd; disk device disk/cdrom/floppy/lun, bus fdc/scsi/virtio/usb/sata, model virtio/virtio-transitional/virtio-non-transitional; graphics sdl/vnc/spice/egl-headless; hostdev mode subsystem, startupPolicy default/mandatory/requisite/optional, type usb/pci/scsi, model virtio/virtio-transitional/virtio-non-transitional; rng backend random/egd/builtin; filesystem driver path/handle/virtiofs; tpm model tpm-tis/tpm-crb, backend passthrough/emulator] {{(pid=71283) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Apr 20 10:24:16 user nova-compute[71283]: [domainCapabilities XML (markup lost in log capture): path /usr/bin/qemu-system-x86_64; domain kvm; machine pc-q35-6.2; arch x86_64; os firmware efi, loaders /usr/share/OVMF/OVMF_CODE_4M.ms.fd, /usr/share/OVMF/OVMF_CODE_4M.secboot.fd, /usr/share/OVMF/OVMF_CODE_4M.fd, loader type rom/pflash, readonly yes/no, secure yes/no; cpu host-passthrough on/off, host-model IvyBridge-IBRS (vendor Intel), same custom model list as for machine_type=ubuntu above; memory backing file/anonymous/memfd; disk device disk/cdrom/floppy/lun, bus fdc/scsi/virtio/usb/sata, model virtio/virtio-transitional/virtio-non-transitional; graphics sdl/vnc/spice/egl-headless; hostdev mode subsystem, startupPolicy default/mandatory/requisite/optional, type usb/pci/scsi, model virtio/virtio-transitional/virtio-non-transitional; rng backend random/egd/builtin; filesystem driver path/handle/virtiofs; tpm model tpm-tis/tpm-crb, backend passthrough/emulator] {{(pid=71283) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Getting domain capabilities for xtensa via machine types: {None} {{(pid=71283) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when
retrieving domain capabilities for arch xtensa / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-xtensa' on this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Getting domain capabilities for xtensaeb via machine types: {None} {{(pid=71283) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Error from libvirt when retrieving domain capabilities for arch xtensaeb / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-xtensaeb' on this host {{(pid=71283) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Checking secure boot support for host arch (x86_64) {{(pid=71283) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1750}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Checking secure boot support for host arch (x86_64) {{(pid=71283) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1750}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Checking secure boot support for host arch (x86_64) {{(pid=71283) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1750}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Checking secure boot support for host arch (x86_64) {{(pid=71283) supports_secure_boot 
/opt/stack/nova/nova/virt/libvirt/host.py:1750}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Checking secure boot support for host arch (x86_64) {{(pid=71283) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1750}} Apr 20 10:24:16 user nova-compute[71283]: INFO nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Secure Boot support detected Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] cpu compare xml: [CPU comparison XML omitted from log capture; model: Nehalem] {{(pid=71283) _compare_cpu /opt/stack/nova/nova/virt/libvirt/driver.py:9996}} Apr 20 10:24:16 user nova-compute[71283]: INFO nova.virt.node [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Generated node identity bdbc83bd-9307-4e20-8e3d-430b77499399 Apr 20 10:24:16 user nova-compute[71283]: INFO nova.virt.node [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Wrote node identity bdbc83bd-9307-4e20-8e3d-430b77499399 to /opt/stack/data/nova/compute_id Apr 20 10:24:16 user nova-compute[71283]: WARNING nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Compute nodes ['bdbc83bd-9307-4e20-8e3d-430b77499399'] for host user were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. Apr 20 10:24:16 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host Apr 20 10:24:16 user nova-compute[71283]: WARNING nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] No compute node record found for host user.
If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host user could not be found. Apr 20 10:24:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:24:16 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:24:16 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 20 10:24:16 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Hypervisor/Node resource view: name=user free_ram=10846MB free_disk=27.02054214477539GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": 
"0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view 
/opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:24:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:24:16 user nova-compute[71283]: WARNING nova.compute.resource_tracker [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] No compute node record for user:bdbc83bd-9307-4e20-8e3d-430b77499399: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host bdbc83bd-9307-4e20-8e3d-430b77499399 could not be found. 
Apr 20 10:24:16 user nova-compute[71283]: INFO nova.compute.resource_tracker [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Compute node record created for user:user with uuid: bdbc83bd-9307-4e20-8e3d-430b77499399 Apr 20 10:24:17 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:24:17 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:24:17 user nova-compute[71283]: INFO nova.scheduler.client.report [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [req-660917c2-7374-4b34-890f-09fbd2b46276] Created resource provider record via placement API for resource provider with UUID bdbc83bd-9307-4e20-8e3d-430b77499399 and name user. 
Apr 20 10:24:17 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] /sys/module/kvm_amd/parameters/sev does not exist {{(pid=71283) _kernel_supports_amd_sev /opt/stack/nova/nova/virt/libvirt/host.py:1766}} Apr 20 10:24:17 user nova-compute[71283]: INFO nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] kernel doesn't support AMD SEV Apr 20 10:24:17 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Updating inventory in ProviderTree for provider bdbc83bd-9307-4e20-8e3d-430b77499399 with inventory: {'MEMORY_MB': {'total': 16023, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 12, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 40, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 0}} {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 20 10:24:17 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71283) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 10:24:17 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Libvirt baseline CPU [baseline CPU XML omitted from log capture; arch: x86_64, model: Nehalem, vendor: Intel] {{(pid=71283) _get_guest_baseline_cpu_features /opt/stack/nova/nova/virt/libvirt/driver.py:12486}} Apr 20 10:24:17 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None
req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Updated inventory for provider bdbc83bd-9307-4e20-8e3d-430b77499399 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 16023, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 12, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 40, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} Apr 20 10:24:17 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Updating resource provider bdbc83bd-9307-4e20-8e3d-430b77499399 generation from 0 to 1 during operation: update_inventory {{(pid=71283) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} Apr 20 10:24:17 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Updating inventory in ProviderTree for provider bdbc83bd-9307-4e20-8e3d-430b77499399 with inventory: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 20 10:24:17 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Updating resource provider bdbc83bd-9307-4e20-8e3d-430b77499399 generation from 1 to 2 during operation: update_traits {{(pid=71283) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} Apr 20 10:24:17 user 
nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:24:17 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.662s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:24:17 user nova-compute[71283]: DEBUG nova.service [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Creating RPC server for service compute {{(pid=71283) start /opt/stack/nova/nova/service.py:182}} Apr 20 10:24:17 user nova-compute[71283]: DEBUG nova.service [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Join ServiceGroup membership for this service compute {{(pid=71283) start /opt/stack/nova/nova/service.py:199}} Apr 20 10:24:17 user nova-compute[71283]: DEBUG nova.servicegroup.drivers.db [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] DB_Driver: join new ServiceGroup member user to the compute group, service = {{(pid=71283) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} Apr 20 10:24:32 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._sync_power_states {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:24:32 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:25:08 user nova-compute[71283]: 
DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:25:08 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:25:08 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:25:08 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Rebuilding the list of instances to heal {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 10:25:08 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Didn't find any instances for network info cache update. 
{{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 20 10:25:08 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:25:08 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:25:08 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:25:08 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:25:08 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:25:08 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:25:08 user nova-compute[71283]: DEBUG nova.compute.manager [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:25:08 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:25:08 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:25:08 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:25:08 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:25:08 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:25:09 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to 
have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:25:09 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:25:09 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=10223MB free_disk=26.934364318847656GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 10:25:09 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:25:09 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:25:09 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:25:09 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:25:09 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory 
/opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:25:09 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:25:09 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:25:09 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.286s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:26:09 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:26:09 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:26:09 user 
nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:26:09 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Rebuilding the list of instances to heal {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 10:26:09 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Didn't find any instances for network info cache update. {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 20 10:26:09 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:26:09 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:26:09 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:26:09 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:26:09 user 
nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:26:09 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:26:09 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:26:09 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:26:09 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:26:09 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:26:09 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=10223MB free_disk=26.980510711669922GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 
20 10:26:09 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:26:09 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:26:09 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:26:09 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:26:09 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:26:09 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 
'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:26:09 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:26:09 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.111s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:26:10 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:26:10 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:26:10 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:26:10 user nova-compute[71283]: DEBUG nova.compute.manager [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:26:10 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:27:08 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:27:08 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:27:09 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:27:09 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:27:09 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:27:09 user nova-compute[71283]: 
DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Rebuilding the list of instances to heal {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 10:27:09 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Didn't find any instances for network info cache update. {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 20 10:27:09 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:27:09 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:27:09 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:27:09 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:27:09 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:27:09 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:27:09 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:27:09 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:27:10 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:27:10 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:27:10 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=10251MB free_disk=26.756484985351562GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": 
null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", 
"address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, 
"label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": 
"0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 10:27:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:27:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:27:10 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:27:10 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:27:10 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:27:10 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:27:10 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:27:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.147s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:27:12 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:27:12 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:28:08 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task 
ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:28:09 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:28:09 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:28:09 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:28:09 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:09 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:09 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:09 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:28:10 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:28:10 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:28:10 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=9474MB free_disk=26.77846908569336GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", 
"vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, 
{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": 
"8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 10:28:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:10 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:28:10 user nova-compute[71283]: DEBUG 
nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:28:10 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:28:10 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:28:10 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:28:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.106s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:11 user nova-compute[71283]: DEBUG 
oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:28:11 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:28:11 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Rebuilding the list of instances to heal {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 10:28:11 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Didn't find any instances for network info cache update. {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 20 10:28:11 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:28:11 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:28:11 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:28:12 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:28:12 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:28:13 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:28:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Acquiring lock "cf494c03-d188-49c7-879e-29d2fa555549" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Lock "cf494c03-d188-49c7-879e-29d2fa555549" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:21 user nova-compute[71283]: DEBUG 
nova.compute.manager [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Starting instance... {{(pid=71283) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 10:28:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:21 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71283) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 10:28:21 user nova-compute[71283]: INFO nova.compute.claims [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Claim successful on node user Apr 20 10:28:22 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:28:22 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:28:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.311s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:22 user 
nova-compute[71283]: DEBUG nova.compute.manager [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Start building networks asynchronously for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 10:28:22 user nova-compute[71283]: DEBUG nova.compute.manager [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Allocating IP information in the background. {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 10:28:22 user nova-compute[71283]: DEBUG nova.network.neutron [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] allocate_for_instance() {{(pid=71283) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 10:28:22 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 10:28:22 user nova-compute[71283]: DEBUG nova.compute.manager [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Start building block device mappings for instance. 
{{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 10:28:22 user nova-compute[71283]: DEBUG nova.compute.manager [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Start spawning the instance on the hypervisor. {{(pid=71283) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 10:28:22 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Creating instance directory {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 10:28:22 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Creating image(s) Apr 20 10:28:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Acquiring lock "/opt/stack/data/nova/instances/cf494c03-d188-49c7-879e-29d2fa555549/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Lock "/opt/stack/data/nova/instances/cf494c03-d188-49c7-879e-29d2fa555549/disk.info" acquired by 
"nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Lock "/opt/stack/data/nova/instances/cf494c03-d188-49c7-879e-29d2fa555549/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Acquiring lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:23 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113.part --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:23 user nova-compute[71283]: DEBUG nova.policy [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '25f64a72e9ec4ad599a5c63bec4d092e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f986df042c594f71a4db3da582def690', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71283) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 10:28:23 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113.part --force-share --output=json" returned: 0 in 0.129s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:23 user nova-compute[71283]: DEBUG nova.virt.images [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] 3687b278-c2bc-46f2-9b7e-579c8d06fe41 was qcow2, converting to raw {{(pid=71283) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 20 10:28:23 user nova-compute[71283]: DEBUG nova.privsep.utils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 
tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71283) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 20 10:28:23 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113.part /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113.converted {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:23 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113.part /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113.converted" returned: 0 in 0.132s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:23 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113.converted --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:23 user nova-compute[71283]: DEBUG 
oslo_concurrency.processutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113.converted --force-share --output=json" returned: 0 in 0.133s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:23 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.322s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:23 user nova-compute[71283]: INFO oslo.privsep.daemon [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova-cpu.conf', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpx398lctu/privsep.sock'] Apr 20 10:28:23 user sudo[80222]: stack : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova-cpu.conf --privsep_context nova.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpx398lctu/privsep.sock Apr 20 10:28:23 user sudo[80222]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1001) Apr 20 10:28:25 user nova-compute[71283]: DEBUG nova.network.neutron [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 
tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Successfully created port: eafcafd8-d33e-48b1-9947-cbce2d458180 {{(pid=71283) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 10:28:25 user sudo[80222]: pam_unix(sudo:session): session closed for user root Apr 20 10:28:25 user nova-compute[71283]: INFO oslo.privsep.daemon [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Spawned new privsep daemon via rootwrap Apr 20 10:28:25 user nova-compute[71283]: INFO oslo.privsep.daemon [-] privsep daemon starting Apr 20 10:28:25 user nova-compute[71283]: INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 Apr 20 10:28:25 user nova-compute[71283]: INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none Apr 20 10:28:25 user nova-compute[71283]: INFO oslo.privsep.daemon [-] privsep daemon running as pid 80225 Apr 20 10:28:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] 
CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.119s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Acquiring lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 
tempest-DeleteServersTestJSON-1091162656-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.122s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/cf494c03-d188-49c7-879e-29d2fa555549/disk 1073741824 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/cf494c03-d188-49c7-879e-29d2fa555549/disk 1073741824" returned: 0 in 0.042s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.169s {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.130s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Checking if we can resize image /opt/stack/data/nova/instances/cf494c03-d188-49c7-879e-29d2fa555549/disk. 
size=1073741824 {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf494c03-d188-49c7-879e-29d2fa555549/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.network.neutron [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Successfully updated port: eafcafd8-d33e-48b1-9947-cbce2d458180 {{(pid=71283) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Acquiring lock "refresh_cache-cf494c03-d188-49c7-879e-29d2fa555549" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Acquired lock "refresh_cache-cf494c03-d188-49c7-879e-29d2fa555549" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.network.neutron [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 
tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Building network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf494c03-d188-49c7-879e-29d2fa555549/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Cannot resize image /opt/stack/data/nova/instances/cf494c03-d188-49c7-879e-29d2fa555549/disk to a smaller size. 
{{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.objects.instance [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Lazy-loading 'migration_context' on Instance uuid cf494c03-d188-49c7-879e-29d2fa555549 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Created local disks {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Ensure instance console log exists: /opt/stack/data/nova/instances/cf494c03-d188-49c7-879e-29d2fa555549/console.log {{(pid=71283) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.network.neutron [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Instance cache missing network info. {{(pid=71283) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.compute.manager [req-16fa3acd-3a3d-4ea2-996a-2676d869e429 req-6b3edb90-90ed-4634-aa88-41e29ce1d7d1 service nova] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Received event network-changed-eafcafd8-d33e-48b1-9947-cbce2d458180 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.compute.manager [req-16fa3acd-3a3d-4ea2-996a-2676d869e429 req-6b3edb90-90ed-4634-aa88-41e29ce1d7d1 service nova] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Refreshing instance network info cache due to event network-changed-eafcafd8-d33e-48b1-9947-cbce2d458180. 
{{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-16fa3acd-3a3d-4ea2-996a-2676d869e429 req-6b3edb90-90ed-4634-aa88-41e29ce1d7d1 service nova] Acquiring lock "refresh_cache-cf494c03-d188-49c7-879e-29d2fa555549" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.network.neutron [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Updating instance_info_cache with network_info: [{"id": "eafcafd8-d33e-48b1-9947-cbce2d458180", "address": "fa:16:3e:de:6a:d4", "network": {"id": "bd16afd3-655a-4681-9793-21eff7495aee", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1742457713-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f986df042c594f71a4db3da582def690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapeafcafd8-d3", "ovs_interfaceid": "eafcafd8-d33e-48b1-9947-cbce2d458180", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 
tempest-DeleteServersTestJSON-1091162656-project-member] Releasing lock "refresh_cache-cf494c03-d188-49c7-879e-29d2fa555549" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.compute.manager [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Instance network_info: |[{"id": "eafcafd8-d33e-48b1-9947-cbce2d458180", "address": "fa:16:3e:de:6a:d4", "network": {"id": "bd16afd3-655a-4681-9793-21eff7495aee", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1742457713-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f986df042c594f71a4db3da582def690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapeafcafd8-d3", "ovs_interfaceid": "eafcafd8-d33e-48b1-9947-cbce2d458180", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-16fa3acd-3a3d-4ea2-996a-2676d869e429 req-6b3edb90-90ed-4634-aa88-41e29ce1d7d1 service nova] Acquired lock "refresh_cache-cf494c03-d188-49c7-879e-29d2fa555549" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.network.neutron 
[req-16fa3acd-3a3d-4ea2-996a-2676d869e429 req-6b3edb90-90ed-4634-aa88-41e29ce1d7d1 service nova] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Refreshing network info cache for port eafcafd8-d33e-48b1-9947-cbce2d458180 {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Start _get_guest_xml network_info=[{"id": "eafcafd8-d33e-48b1-9947-cbce2d458180", "address": "fa:16:3e:de:6a:d4", "network": {"id": "bd16afd3-655a-4681-9793-21eff7495aee", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1742457713-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f986df042c594f71a4db3da582def690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapeafcafd8-d3", "ovs_interfaceid": "eafcafd8-d33e-48b1-9947-cbce2d458180", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '3687b278-c2bc-46f2-9b7e-579c8d06fe41'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 10:28:26 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:28:26 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71283) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T10:25:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=), allow threads: True {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Flavor limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Image 
limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Flavor pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Image pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.virt.hardware [None 
req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Got 1 possible topologies {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.privsep.utils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71283) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:28:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-294793328',display_name='tempest-DeleteServersTestJSON-server-294793328',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-294793328',id=1,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f986df042c594f71a4db3da582def690',ramdisk_id='',reservation_id='r-qdvy05xy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1091162656',owner_user_name='tempest-DeleteServersTestJSON-1091162656-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:28:23Z,user_data=None,user_id='25f64a72e9ec4ad599a5c63bec4d092e'
,uuid=cf494c03-d188-49c7-879e-29d2fa555549,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eafcafd8-d33e-48b1-9947-cbce2d458180", "address": "fa:16:3e:de:6a:d4", "network": {"id": "bd16afd3-655a-4681-9793-21eff7495aee", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1742457713-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f986df042c594f71a4db3da582def690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapeafcafd8-d3", "ovs_interfaceid": "eafcafd8-d33e-48b1-9947-cbce2d458180", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71283) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Converting VIF {"id": "eafcafd8-d33e-48b1-9947-cbce2d458180", "address": "fa:16:3e:de:6a:d4", "network": {"id": "bd16afd3-655a-4681-9793-21eff7495aee", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1742457713-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": 
"f986df042c594f71a4db3da582def690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapeafcafd8-d3", "ovs_interfaceid": "eafcafd8-d33e-48b1-9947-cbce2d458180", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:6a:d4,bridge_name='br-int',has_traffic_filtering=True,id=eafcafd8-d33e-48b1-9947-cbce2d458180,network=Network(bd16afd3-655a-4681-9793-21eff7495aee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeafcafd8-d3') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.objects.instance [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Lazy-loading 'pci_devices' on Instance uuid cf494c03-d188-49c7-879e-29d2fa555549 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] End _get_guest_xml xml= [libvirt guest XML elided: the angle-bracket markup was stripped during log extraction; fields still recoverable from the residue: uuid cf494c03-d188-49c7-879e-29d2fa555549, name instance-00000001, memory 131072 KiB, 1 vCPU, nova display name tempest-DeleteServersTestJSON-server-294793328, creation time 2023-04-20 10:28:26, owner tempest-DeleteServersTestJSON-1091162656-project-member / tempest-DeleteServersTestJSON-1091162656, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0, os type hvm, CPU model Nehalem, RNG backend /dev/urandom] {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656
tempest-DeleteServersTestJSON-1091162656-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:28:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-294793328',display_name='tempest-DeleteServersTestJSON-server-294793328',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-294793328',id=1,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f986df042c594f71a4db3da582def690',ramdisk_id='',reservation_id='r-qdvy05xy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1091162656',owner_user_name='tempest-DeleteServersTestJSON-1091162656-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,update
d_at=2023-04-20T10:28:23Z,user_data=None,user_id='25f64a72e9ec4ad599a5c63bec4d092e',uuid=cf494c03-d188-49c7-879e-29d2fa555549,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eafcafd8-d33e-48b1-9947-cbce2d458180", "address": "fa:16:3e:de:6a:d4", "network": {"id": "bd16afd3-655a-4681-9793-21eff7495aee", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1742457713-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f986df042c594f71a4db3da582def690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapeafcafd8-d3", "ovs_interfaceid": "eafcafd8-d33e-48b1-9947-cbce2d458180", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Converting VIF {"id": "eafcafd8-d33e-48b1-9947-cbce2d458180", "address": "fa:16:3e:de:6a:d4", "network": {"id": "bd16afd3-655a-4681-9793-21eff7495aee", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1742457713-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": 
"10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f986df042c594f71a4db3da582def690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapeafcafd8-d3", "ovs_interfaceid": "eafcafd8-d33e-48b1-9947-cbce2d458180", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:6a:d4,bridge_name='br-int',has_traffic_filtering=True,id=eafcafd8-d33e-48b1-9947-cbce2d458180,network=Network(bd16afd3-655a-4681-9793-21eff7495aee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeafcafd8-d3') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG os_vif [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:6a:d4,bridge_name='br-int',has_traffic_filtering=True,id=eafcafd8-d33e-48b1-9947-cbce2d458180,network=Network(bd16afd3-655a-4681-9793-21eff7495aee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeafcafd8-d3') {{(pid=71283) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 
tempest-DeleteServersTestJSON-1091162656-project-member] Created schema index Interface.name {{(pid=71283) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Created schema index Port.name {{(pid=71283) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Created schema index Bridge.name {{(pid=71283) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] tcp:127.0.0.1:6640: entering CONNECTING {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [POLLOUT] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:28:26 user 
nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:26 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:27 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:27 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:28:27 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 10:28:27 user nova-compute[71283]: INFO oslo.privsep.daemon [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', 
'--config-file', '/etc/nova/nova-cpu.conf', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmprimmy6lr/privsep.sock'] Apr 20 10:28:27 user sudo[80243]: stack : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova-cpu.conf --privsep_context vif_plug_ovs.privsep.vif_plug --privsep_sock_path /tmp/tmprimmy6lr/privsep.sock Apr 20 10:28:27 user sudo[80243]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1001) Apr 20 10:28:27 user nova-compute[71283]: DEBUG nova.network.neutron [req-16fa3acd-3a3d-4ea2-996a-2676d869e429 req-6b3edb90-90ed-4634-aa88-41e29ce1d7d1 service nova] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Updated VIF entry in instance network info cache for port eafcafd8-d33e-48b1-9947-cbce2d458180. {{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:28:27 user nova-compute[71283]: DEBUG nova.network.neutron [req-16fa3acd-3a3d-4ea2-996a-2676d869e429 req-6b3edb90-90ed-4634-aa88-41e29ce1d7d1 service nova] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Updating instance_info_cache with network_info: [{"id": "eafcafd8-d33e-48b1-9947-cbce2d458180", "address": "fa:16:3e:de:6a:d4", "network": {"id": "bd16afd3-655a-4681-9793-21eff7495aee", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1742457713-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f986df042c594f71a4db3da582def690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapeafcafd8-d3", "ovs_interfaceid": 
"eafcafd8-d33e-48b1-9947-cbce2d458180", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:28:27 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-16fa3acd-3a3d-4ea2-996a-2676d869e429 req-6b3edb90-90ed-4634-aa88-41e29ce1d7d1 service nova] Releasing lock "refresh_cache-cf494c03-d188-49c7-879e-29d2fa555549" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:28:27 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:28 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Acquiring lock "f48e6aa1-dd33-42a4-89c9-20691b628c70" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:28 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Lock "f48e6aa1-dd33-42a4-89c9-20691b628c70" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:28 user nova-compute[71283]: DEBUG nova.compute.manager [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675
tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Starting instance... {{(pid=71283) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 10:28:28 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:28 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:28 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71283) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 10:28:28 user nova-compute[71283]: INFO nova.compute.claims [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Claim successful on node user Apr 20 10:28:28 user sudo[80243]: pam_unix(sudo:session): session closed for user root Apr 20 10:28:28 user nova-compute[71283]: INFO oslo.privsep.daemon [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Spawned new privsep daemon via rootwrap Apr 20 10:28:28 user nova-compute[71283]: INFO oslo.privsep.daemon [-] privsep daemon starting Apr 20 10:28:28 user nova-compute[71283]: INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 Apr 20 10:28:28 user nova-compute[71283]: INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none Apr 20 10:28:28 user nova-compute[71283]: INFO oslo.privsep.daemon [-] privsep daemon running as pid 80251 Apr 20 10:28:28 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:28:28 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'MEMORY_MB': 
{'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:28:28 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.271s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:28 user nova-compute[71283]: DEBUG nova.compute.manager [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Start building networks asynchronously for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 10:28:28 user nova-compute[71283]: DEBUG nova.compute.manager [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Allocating IP information in the background. 
{{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 10:28:28 user nova-compute[71283]: DEBUG nova.network.neutron [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] allocate_for_instance() {{(pid=71283) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 10:28:28 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 10:28:28 user nova-compute[71283]: DEBUG nova.compute.manager [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Start building block device mappings for instance. 
{{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG nova.policy [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2aef9efd33b946709d8f01f41b79f382', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2d1545b79af0497497e960cbd68aa8f2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71283) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG nova.compute.manager [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Start spawning the instance on the hypervisor. 
{{(pid=71283) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Creating instance directory {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 10:28:29 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Creating image(s) Apr 20 10:28:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Acquiring lock "/opt/stack/data/nova/instances/f48e6aa1-dd33-42a4-89c9-20691b628c70/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Lock "/opt/stack/data/nova/instances/f48e6aa1-dd33-42a4-89c9-20691b628c70/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675
tempest-ServerStableDeviceRescueTest-1071968675-project-member] Lock "/opt/stack/data/nova/instances/f48e6aa1-dd33-42a4-89c9-20691b628c70/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeafcafd8-d3, may_exist=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeafcafd8-d3, col_values=(('external_ids', {'iface-id': 'eafcafd8-d33e-48b1-9947-cbce2d458180', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:de:6a:d4', 'vm-uuid':
'cf494c03-d188-49c7-879e-29d2fa555549'}),)) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.144s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Acquiring lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:29 user nova-compute[71283]: INFO os_vif [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:6a:d4,bridge_name='br-int',has_traffic_filtering=True,id=eafcafd8-d33e-48b1-9947-cbce2d458180,network=Network(bd16afd3-655a-4681-9793-21eff7495aee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeafcafd8-d3') Apr 20 10:28:29 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.129s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG 
oslo_concurrency.processutils [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/f48e6aa1-dd33-42a4-89c9-20691b628c70/disk 1073741824 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] No BDM found with device name vda, not building metadata. {{(pid=71283) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] No VIF found with MAC fa:16:3e:de:6a:d4, not building metadata {{(pid=71283) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/f48e6aa1-dd33-42a4-89c9-20691b628c70/disk 1073741824" returned: 0 in 0.052s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None 
req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.183s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Acquiring lock "0f7a669c-28aa-42d2-8991-5852336c0f42" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "0f7a669c-28aa-42d2-8991-5852336c0f42" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG nova.compute.manager 
[None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Starting instance... {{(pid=71283) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.145s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Checking if we can resize image /opt/stack/data/nova/instances/f48e6aa1-dd33-42a4-89c9-20691b628c70/disk. 
size=1073741824 {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f48e6aa1-dd33-42a4-89c9-20691b628c70/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71283) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 10:28:29 user nova-compute[71283]: INFO nova.compute.claims [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Claim successful on node user Apr 20 10:28:29 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f48e6aa1-dd33-42a4-89c9-20691b628c70/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Cannot resize image /opt/stack/data/nova/instances/f48e6aa1-dd33-42a4-89c9-20691b628c70/disk to a smaller size. 
{{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG nova.objects.instance [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Lazy-loading 'migration_context' on Instance uuid f48e6aa1-dd33-42a4-89c9-20691b628c70 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Created local disks {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Ensure instance console log exists: /opt/stack/data/nova/instances/f48e6aa1-dd33-42a4-89c9-20691b628c70/console.log {{(pid=71283) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] 
Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 
tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.262s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:29 user nova-compute[71283]: DEBUG nova.compute.manager [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Start building networks asynchronously for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 10:28:30 user nova-compute[71283]: DEBUG nova.compute.manager [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Allocating IP information in the background. {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 10:28:30 user nova-compute[71283]: DEBUG nova.network.neutron [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] allocate_for_instance() {{(pid=71283) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 10:28:30 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 20 10:28:30 user nova-compute[71283]: DEBUG nova.compute.manager [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Start building block device mappings for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 10:28:30 user nova-compute[71283]: DEBUG nova.compute.manager [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Start spawning the instance on the hypervisor. {{(pid=71283) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 10:28:30 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Creating instance directory {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 10:28:30 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Creating image(s) Apr 20 10:28:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Acquiring lock "/opt/stack/data/nova/instances/0f7a669c-28aa-42d2-8991-5852336c0f42/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:30 
user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "/opt/stack/data/nova/instances/0f7a669c-28aa-42d2-8991-5852336c0f42/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "/opt/stack/data/nova/instances/0f7a669c-28aa-42d2-8991-5852336c0f42/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.268s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:30 user nova-compute[71283]: DEBUG nova.policy [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '67955d2e81c04b8d8dbcbe577303e025', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1b4a2af680394ec889b4661753658b01', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71283) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 10:28:30 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Running cmd 
(subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Acquiring lock "64fb0d55-ef35-4386-86fe-00775b83a8d4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Lock "64fb0d55-ef35-4386-86fe-00775b83a8d4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:30 user nova-compute[71283]: DEBUG nova.compute.manager [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Starting instance... 
{{(pid=71283) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 10:28:30 user nova-compute[71283]: DEBUG nova.network.neutron [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Successfully created port: 0954983e-5cb4-4486-9816-67f4f5f78b35 {{(pid=71283) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 10:28:30 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.132s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Acquiring lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 
10:28:30 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:30 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71283) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 10:28:30 user nova-compute[71283]: INFO nova.compute.claims [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Claim successful on node user Apr 20 10:28:30 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:30 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:30 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.165s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:30 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/0f7a669c-28aa-42d2-8991-5852336c0f42/disk 1073741824 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:30 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 
22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:30 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/0f7a669c-28aa-42d2-8991-5852336c0f42/disk 1073741824" returned: 0 in 0.070s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.238s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:30 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 
--cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.141s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Checking if we can resize image /opt/stack/data/nova/instances/0f7a669c-28aa-42d2-8991-5852336c0f42/disk. size=1073741824 {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0f7a669c-28aa-42d2-8991-5852336c0f42/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 
512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.471s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG nova.compute.manager [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Start building networks asynchronously for instance. 
{{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0f7a669c-28aa-42d2-8991-5852336c0f42/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Cannot resize image /opt/stack/data/nova/instances/0f7a669c-28aa-42d2-8991-5852336c0f42/disk to a smaller size. {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG nova.objects.instance [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lazy-loading 'migration_context' on Instance uuid 0f7a669c-28aa-42d2-8991-5852336c0f42 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Created local disks {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] 
[instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Ensure instance console log exists: /opt/stack/data/nova/instances/0f7a669c-28aa-42d2-8991-5852336c0f42/console.log {{(pid=71283) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG nova.compute.manager [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Allocating IP information in the background. 
{{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG nova.network.neutron [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] allocate_for_instance() {{(pid=71283) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 10:28:31 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 10:28:31 user nova-compute[71283]: DEBUG nova.compute.manager [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Start building block device mappings for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG nova.compute.manager [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Start spawning the instance on the hypervisor. 
{{(pid=71283) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Creating instance directory {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 10:28:31 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Creating image(s) Apr 20 10:28:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Acquiring lock "/opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Lock "/opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Lock
"/opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG nova.policy [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bc5470a076e14c218cf04fad713ce074', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8d9c5dd66fd0496a8e92b41d98fe1727', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71283) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info
/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.155s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Acquiring lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 
--cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.121s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk 1073741824 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk 1073741824" returned: 0 in 0.061s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.206s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:31 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None 
req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.130s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Checking if we can resize image /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk. 
size=1073741824 {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG nova.compute.manager [req-a798dc8e-5be8-4532-9659-f944b6798947 req-0794ba74-ece7-494d-bd87-fcd711145a00 service nova] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Received event network-vif-plugged-eafcafd8-d33e-48b1-9947-cbce2d458180 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-a798dc8e-5be8-4532-9659-f944b6798947 req-0794ba74-ece7-494d-bd87-fcd711145a00 service nova] Acquiring lock "cf494c03-d188-49c7-879e-29d2fa555549-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-a798dc8e-5be8-4532-9659-f944b6798947 req-0794ba74-ece7-494d-bd87-fcd711145a00 service nova] Lock "cf494c03-d188-49c7-879e-29d2fa555549-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-a798dc8e-5be8-4532-9659-f944b6798947 req-0794ba74-ece7-494d-bd87-fcd711145a00 service nova] Lock "cf494c03-d188-49c7-879e-29d2fa555549-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG nova.compute.manager [req-a798dc8e-5be8-4532-9659-f944b6798947 req-0794ba74-ece7-494d-bd87-fcd711145a00 service nova] [instance: cf494c03-d188-49c7-879e-29d2fa555549] No waiting events found dispatching network-vif-plugged-eafcafd8-d33e-48b1-9947-cbce2d458180 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:28:32 user nova-compute[71283]: WARNING nova.compute.manager [req-a798dc8e-5be8-4532-9659-f944b6798947 req-0794ba74-ece7-494d-bd87-fcd711145a00 service nova] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Received unexpected event network-vif-plugged-eafcafd8-d33e-48b1-9947-cbce2d458180 for instance with vm_state building and task_state spawning. 
Apr 20 10:28:32 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Cannot resize image /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk to a smaller size. {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG nova.objects.instance [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Lazy-loading 'migration_context' on Instance uuid 64fb0d55-ef35-4386-86fe-00775b83a8d4 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Created local disks {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Ensure instance console log exists: 
/opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/console.log {{(pid=71283) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Resumed> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:28:32 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: cf494c03-d188-49c7-879e-29d2fa555549] VM Resumed (Lifecycle Event) Apr 20 10:28:32 user nova-compute[71283]: DEBUG nova.compute.manager [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 
tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Instance event wait completed in 0 seconds for {{(pid=71283) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Guest created on hypervisor {{(pid=71283) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 10:28:32 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Instance spawned successfully. Apr 20 10:28:32 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Found default for hw_cdrom_bus of ide {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: 
cf494c03-d188-49c7-879e-29d2fa555549] Found default for hw_disk_bus of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Found default for hw_input_bus of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Found default for hw_pointer_model of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Found default for hw_video_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Found default for hw_vif_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: cf494c03-d188-49c7-879e-29d2fa555549] 
Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:28:32 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: cf494c03-d188-49c7-879e-29d2fa555549] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 10:28:32 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Started> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:28:32 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: cf494c03-d188-49c7-879e-29d2fa555549] VM Started (Lifecycle Event) Apr 20 10:28:32 user nova-compute[71283]: INFO nova.compute.manager [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Took 9.94 seconds to spawn the instance on the hypervisor. 
Apr 20 10:28:32 user nova-compute[71283]: DEBUG nova.compute.manager [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:28:32 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: cf494c03-d188-49c7-879e-29d2fa555549] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 10:28:32 user nova-compute[71283]: INFO nova.compute.manager [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Took 10.80 seconds to build instance. 
Apr 20 10:28:32 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f896f2b7-ea25-4649-8248-254cdf0ac136 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Lock "cf494c03-d188-49c7-879e-29d2fa555549" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.996s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:32 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:33 user nova-compute[71283]: DEBUG nova.network.neutron [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Successfully created port: 9bd8e190-3b17-470b-9274-5060f7875bba {{(pid=71283) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 10:28:33 user nova-compute[71283]: DEBUG nova.network.neutron [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Successfully updated port: 0954983e-5cb4-4486-9816-67f4f5f78b35 {{(pid=71283) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 10:28:33 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Acquiring lock "refresh_cache-f48e6aa1-dd33-42a4-89c9-20691b628c70" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:28:33 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None
req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Acquired lock "refresh_cache-f48e6aa1-dd33-42a4-89c9-20691b628c70" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:28:33 user nova-compute[71283]: DEBUG nova.network.neutron [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Building network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 10:28:33 user nova-compute[71283]: DEBUG nova.compute.manager [req-1ec6c1c3-7dcd-4e45-83cc-55c8cedbd23b req-11db2226-f544-4eeb-ba99-0b0ec63f0da6 service nova] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Received event network-changed-0954983e-5cb4-4486-9816-67f4f5f78b35 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:28:33 user nova-compute[71283]: DEBUG nova.compute.manager [req-1ec6c1c3-7dcd-4e45-83cc-55c8cedbd23b req-11db2226-f544-4eeb-ba99-0b0ec63f0da6 service nova] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Refreshing instance network info cache due to event network-changed-0954983e-5cb4-4486-9816-67f4f5f78b35. 
{{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:28:33 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-1ec6c1c3-7dcd-4e45-83cc-55c8cedbd23b req-11db2226-f544-4eeb-ba99-0b0ec63f0da6 service nova] Acquiring lock "refresh_cache-f48e6aa1-dd33-42a4-89c9-20691b628c70" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:28:33 user nova-compute[71283]: DEBUG nova.network.neutron [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Instance cache missing network info. {{(pid=71283) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG nova.compute.manager [req-1ad79298-fe56-437b-a90f-bb086838ea9d req-2b7a6282-e4bd-445c-999e-5a9571884b3f service nova] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Received event network-vif-plugged-eafcafd8-d33e-48b1-9947-cbce2d458180 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-1ad79298-fe56-437b-a90f-bb086838ea9d req-2b7a6282-e4bd-445c-999e-5a9571884b3f service nova] Acquiring lock "cf494c03-d188-49c7-879e-29d2fa555549-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:34 user 
nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-1ad79298-fe56-437b-a90f-bb086838ea9d req-2b7a6282-e4bd-445c-999e-5a9571884b3f service nova] Lock "cf494c03-d188-49c7-879e-29d2fa555549-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-1ad79298-fe56-437b-a90f-bb086838ea9d req-2b7a6282-e4bd-445c-999e-5a9571884b3f service nova] Lock "cf494c03-d188-49c7-879e-29d2fa555549-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG nova.compute.manager [req-1ad79298-fe56-437b-a90f-bb086838ea9d req-2b7a6282-e4bd-445c-999e-5a9571884b3f service nova] [instance: cf494c03-d188-49c7-879e-29d2fa555549] No waiting events found dispatching network-vif-plugged-eafcafd8-d33e-48b1-9947-cbce2d458180 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:28:34 user nova-compute[71283]: WARNING nova.compute.manager [req-1ad79298-fe56-437b-a90f-bb086838ea9d req-2b7a6282-e4bd-445c-999e-5a9571884b3f service nova] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Received unexpected event network-vif-plugged-eafcafd8-d33e-48b1-9947-cbce2d458180 for instance with vm_state active and task_state None. 
Apr 20 10:28:34 user nova-compute[71283]: DEBUG nova.network.neutron [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Updating instance_info_cache with network_info: [{"id": "0954983e-5cb4-4486-9816-67f4f5f78b35", "address": "fa:16:3e:dc:a8:8c", "network": {"id": "06131689-128d-43da-b721-a9dda3f2e19f", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1783672421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2d1545b79af0497497e960cbd68aa8f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0954983e-5c", "ovs_interfaceid": "0954983e-5cb4-4486-9816-67f4f5f78b35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Releasing lock "refresh_cache-f48e6aa1-dd33-42a4-89c9-20691b628c70" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG nova.compute.manager [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 
tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Instance network_info: |[{"id": "0954983e-5cb4-4486-9816-67f4f5f78b35", "address": "fa:16:3e:dc:a8:8c", "network": {"id": "06131689-128d-43da-b721-a9dda3f2e19f", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1783672421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2d1545b79af0497497e960cbd68aa8f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0954983e-5c", "ovs_interfaceid": "0954983e-5cb4-4486-9816-67f4f5f78b35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-1ec6c1c3-7dcd-4e45-83cc-55c8cedbd23b req-11db2226-f544-4eeb-ba99-0b0ec63f0da6 service nova] Acquired lock "refresh_cache-f48e6aa1-dd33-42a4-89c9-20691b628c70" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG nova.network.neutron [req-1ec6c1c3-7dcd-4e45-83cc-55c8cedbd23b req-11db2226-f544-4eeb-ba99-0b0ec63f0da6 service nova] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Refreshing network info cache for port 0954983e-5cb4-4486-9816-67f4f5f78b35 {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG 
nova.virt.libvirt.driver [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Start _get_guest_xml network_info=[{"id": "0954983e-5cb4-4486-9816-67f4f5f78b35", "address": "fa:16:3e:dc:a8:8c", "network": {"id": "06131689-128d-43da-b721-a9dda3f2e19f", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1783672421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2d1545b79af0497497e960cbd68aa8f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0954983e-5c", "ovs_interfaceid": "0954983e-5cb4-4486-9816-67f4f5f78b35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 
'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '3687b278-c2bc-46f2-9b7e-579c8d06fe41'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 10:28:34 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:28:34 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:28:34 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71283) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T10:25:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=), allow threads: True {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Flavor limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 
tempest-ServerStableDeviceRescueTest-1071968675-project-member] Image limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Flavor pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Image pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71283) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Got 1 possible topologies {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:28:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1817883104',display_name='tempest-ServerStableDeviceRescueTest-server-1817883104',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverstabledevicerescuetest-server-1817883104',id=2,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFPer3atuApOKAbOrxgU77C03uvd3ClFPy3FRncu+A4hBfn7+Ligr5c8zlC1kkZihqe2/A/t/d8AFp1mDAF+J+oDrvVOTMmcabgcsFyIy7gKcmM4pBHM2RndymuOvmh30Q==',key_name='tempest-keypair-314135284',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2d1545b79af0497497e960cbd68aa8f2',ramdisk_id='',reservation_id='r-nvccs3w6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerStableDeviceResc
ueTest-1071968675',owner_user_name='tempest-ServerStableDeviceRescueTest-1071968675-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:28:29Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2aef9efd33b946709d8f01f41b79f382',uuid=f48e6aa1-dd33-42a4-89c9-20691b628c70,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0954983e-5cb4-4486-9816-67f4f5f78b35", "address": "fa:16:3e:dc:a8:8c", "network": {"id": "06131689-128d-43da-b721-a9dda3f2e19f", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1783672421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2d1545b79af0497497e960cbd68aa8f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0954983e-5c", "ovs_interfaceid": "0954983e-5cb4-4486-9816-67f4f5f78b35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71283) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Converting VIF {"id": "0954983e-5cb4-4486-9816-67f4f5f78b35", "address": "fa:16:3e:dc:a8:8c", "network": {"id": "06131689-128d-43da-b721-a9dda3f2e19f", "bridge": "br-int", 
"label": "tempest-ServerStableDeviceRescueTest-1783672421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2d1545b79af0497497e960cbd68aa8f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0954983e-5c", "ovs_interfaceid": "0954983e-5cb4-4486-9816-67f4f5f78b35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dc:a8:8c,bridge_name='br-int',has_traffic_filtering=True,id=0954983e-5cb4-4486-9816-67f4f5f78b35,network=Network(06131689-128d-43da-b721-a9dda3f2e19f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0954983e-5c') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG nova.objects.instance [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Lazy-loading 'pci_devices' on Instance uuid f48e6aa1-dd33-42a4-89c9-20691b628c70 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG 
nova.virt.libvirt.driver [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] End _get_guest_xml xml= Apr 20 10:28:34 user nova-compute[71283]: f48e6aa1-dd33-42a4-89c9-20691b628c70 Apr 20 10:28:34 user nova-compute[71283]: instance-00000002 Apr 20 10:28:34 user nova-compute[71283]: 131072 Apr 20 10:28:34 user nova-compute[71283]: 1 Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: tempest-ServerStableDeviceRescueTest-server-1817883104 Apr 20 10:28:34 user nova-compute[71283]: 2023-04-20 10:28:34 Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: 128 Apr 20 10:28:34 user nova-compute[71283]: 1 Apr 20 10:28:34 user nova-compute[71283]: 0 Apr 20 10:28:34 user nova-compute[71283]: 0 Apr 20 10:28:34 user nova-compute[71283]: 1 Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: tempest-ServerStableDeviceRescueTest-1071968675-project-member Apr 20 10:28:34 user nova-compute[71283]: tempest-ServerStableDeviceRescueTest-1071968675 Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: OpenStack Foundation Apr 20 10:28:34 user nova-compute[71283]: OpenStack Nova Apr 20 10:28:34 user nova-compute[71283]: 0.0.0 Apr 20 10:28:34 user nova-compute[71283]: 
f48e6aa1-dd33-42a4-89c9-20691b628c70 Apr 20 10:28:34 user nova-compute[71283]: f48e6aa1-dd33-42a4-89c9-20691b628c70 Apr 20 10:28:34 user nova-compute[71283]: Virtual Machine Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: hvm Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Nehalem Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: /dev/urandom Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user 
nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: Apr 20 10:28:34 user nova-compute[71283]: {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:28:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1817883104',display_name='tempest-ServerStableDeviceRescueTest-server-1817883104',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverstabledevicerescuetest-server-1817883104',id=2,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFPer3atuApOKAbOrxgU77C03uvd3ClFPy3FRncu+A4hBfn7+Ligr5c8zlC1kkZihqe2/A/t/d8AFp1mDAF+J+oDrvVOTMmcabgcsFyIy7gKcmM4pBHM2RndymuOvmh30Q==',key_name='tempest-keypair-314135284',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2d1545b79af0497497e960cbd68aa8f2',ramdisk_id='',reservation_id='r-nvccs3w6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1071968675',owner_user_name='tempest-ServerStableDeviceRescueTest-1071968675-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:28:29Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2aef9efd33b946709d8f01f41b79f382',uuid=f48e6aa1-dd33-42a4-89c9-20691b628c70,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0954983e-5cb4-4486-9816-67f4f5f78b35", "address": "fa:16:3e:dc:a8:8c", "network": {"id": "06131689-128d-43da-b721-a9dda3f2e19f", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1783672421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], 
"gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2d1545b79af0497497e960cbd68aa8f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0954983e-5c", "ovs_interfaceid": "0954983e-5cb4-4486-9816-67f4f5f78b35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Converting VIF {"id": "0954983e-5cb4-4486-9816-67f4f5f78b35", "address": "fa:16:3e:dc:a8:8c", "network": {"id": "06131689-128d-43da-b721-a9dda3f2e19f", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1783672421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2d1545b79af0497497e960cbd68aa8f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0954983e-5c", "ovs_interfaceid": "0954983e-5cb4-4486-9816-67f4f5f78b35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dc:a8:8c,bridge_name='br-int',has_traffic_filtering=True,id=0954983e-5cb4-4486-9816-67f4f5f78b35,network=Network(06131689-128d-43da-b721-a9dda3f2e19f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0954983e-5c') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG os_vif [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dc:a8:8c,bridge_name='br-int',has_traffic_filtering=True,id=0954983e-5cb4-4486-9816-67f4f5f78b35,network=Network(06131689-128d-43da-b721-a9dda3f2e19f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0954983e-5c') {{(pid=71283) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no 
change {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0954983e-5c, may_exist=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0954983e-5c, col_values=(('external_ids', {'iface-id': '0954983e-5cb4-4486-9816-67f4f5f78b35', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dc:a8:8c', 'vm-uuid': 'f48e6aa1-dd33-42a4-89c9-20691b628c70'}),)) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:34 user nova-compute[71283]: INFO os_vif [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Successfully plugged vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:dc:a8:8c,bridge_name='br-int',has_traffic_filtering=True,id=0954983e-5cb4-4486-9816-67f4f5f78b35,network=Network(06131689-128d-43da-b721-a9dda3f2e19f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0954983e-5c') Apr 20 10:28:34 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] No BDM found with device name vda, not building metadata. {{(pid=71283) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] No VIF found with MAC fa:16:3e:dc:a8:8c, not building metadata {{(pid=71283) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Acquiring lock "9189f862-2e91-4420-ab64-54375c4f9466" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "9189f862-2e91-4420-ab64-54375c4f9466" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG nova.compute.manager [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Starting instance... {{(pid=71283) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 10:28:34 user nova-compute[71283]: DEBUG nova.network.neutron [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Successfully created port: 851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1 {{(pid=71283) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 10:28:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:35 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Require both a host and instance NUMA topology to fit 
instance on host. {{(pid=71283) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 10:28:35 user nova-compute[71283]: INFO nova.compute.claims [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Claim successful on node user Apr 20 10:28:35 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:28:35 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:28:35 user nova-compute[71283]: DEBUG nova.network.neutron [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Successfully updated port: 9bd8e190-3b17-470b-9274-5060f7875bba {{(pid=71283) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 10:28:35 user 
nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.433s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:35 user nova-compute[71283]: DEBUG nova.compute.manager [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Start building networks asynchronously for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 10:28:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Acquiring lock "refresh_cache-0f7a669c-28aa-42d2-8991-5852336c0f42" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:28:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Acquired lock "refresh_cache-0f7a669c-28aa-42d2-8991-5852336c0f42" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:28:35 user nova-compute[71283]: DEBUG nova.network.neutron [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Building network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 10:28:35 user 
nova-compute[71283]: DEBUG nova.compute.manager [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Allocating IP information in the background. {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 10:28:35 user nova-compute[71283]: DEBUG nova.network.neutron [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] allocate_for_instance() {{(pid=71283) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 10:28:35 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 10:28:35 user nova-compute[71283]: DEBUG nova.compute.manager [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Start building block device mappings for instance. 
{{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 10:28:35 user nova-compute[71283]: DEBUG nova.compute.manager [req-e6a10af7-cbd1-47e9-ba96-d255190a98eb req-9a789ce7-f254-417e-b04f-d86b832fe56b service nova] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Received event network-changed-9bd8e190-3b17-470b-9274-5060f7875bba {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:28:35 user nova-compute[71283]: DEBUG nova.compute.manager [req-e6a10af7-cbd1-47e9-ba96-d255190a98eb req-9a789ce7-f254-417e-b04f-d86b832fe56b service nova] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Refreshing instance network info cache due to event network-changed-9bd8e190-3b17-470b-9274-5060f7875bba. {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:28:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-e6a10af7-cbd1-47e9-ba96-d255190a98eb req-9a789ce7-f254-417e-b04f-d86b832fe56b service nova] Acquiring lock "refresh_cache-0f7a669c-28aa-42d2-8991-5852336c0f42" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:28:35 user nova-compute[71283]: DEBUG nova.network.neutron [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Instance cache missing network info. {{(pid=71283) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 10:28:35 user nova-compute[71283]: DEBUG nova.network.neutron [req-1ec6c1c3-7dcd-4e45-83cc-55c8cedbd23b req-11db2226-f544-4eeb-ba99-0b0ec63f0da6 service nova] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Updated VIF entry in instance network info cache for port 0954983e-5cb4-4486-9816-67f4f5f78b35. 
{{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:28:35 user nova-compute[71283]: DEBUG nova.network.neutron [req-1ec6c1c3-7dcd-4e45-83cc-55c8cedbd23b req-11db2226-f544-4eeb-ba99-0b0ec63f0da6 service nova] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Updating instance_info_cache with network_info: [{"id": "0954983e-5cb4-4486-9816-67f4f5f78b35", "address": "fa:16:3e:dc:a8:8c", "network": {"id": "06131689-128d-43da-b721-a9dda3f2e19f", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1783672421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2d1545b79af0497497e960cbd68aa8f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0954983e-5c", "ovs_interfaceid": "0954983e-5cb4-4486-9816-67f4f5f78b35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:28:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-1ec6c1c3-7dcd-4e45-83cc-55c8cedbd23b req-11db2226-f544-4eeb-ba99-0b0ec63f0da6 service nova] Releasing lock "refresh_cache-f48e6aa1-dd33-42a4-89c9-20691b628c70" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:28:35 user nova-compute[71283]: DEBUG nova.compute.manager [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 
tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Start spawning the instance on the hypervisor. {{(pid=71283) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 10:28:35 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Creating instance directory {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 10:28:35 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Creating image(s) Apr 20 10:28:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Acquiring lock "/opt/stack/data/nova/instances/9189f862-2e91-4420-ab64-54375c4f9466/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "/opt/stack/data/nova/instances/9189f862-2e91-4420-ab64-54375c4f9466/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:35 user nova-compute[71283]: DEBUG 
oslo_concurrency.lockutils [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "/opt/stack/data/nova/instances/9189f862-2e91-4420-ab64-54375c4f9466/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:35 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:35 user nova-compute[71283]: DEBUG nova.policy [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '712be6d6876f4b2c9d796e406a43f8bf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7c3ecc1463ec42eea56f2890b032ef7a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71283) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 
tempest-VolumesAdminNegativeTest-2134667240-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.161s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Acquiring lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 
22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.158s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/9189f862-2e91-4420-ab64-54375c4f9466/disk 1073741824 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o 
backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/9189f862-2e91-4420-ab64-54375c4f9466/disk 1073741824" returned: 0 in 0.050s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.212s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.154s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.virt.disk.api 
[None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Checking if we can resize image /opt/stack/data/nova/instances/9189f862-2e91-4420-ab64-54375c4f9466/disk. size=1073741824 {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9189f862-2e91-4420-ab64-54375c4f9466/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9189f862-2e91-4420-ab64-54375c4f9466/disk --force-share --output=json" returned: 0 in 0.167s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Cannot resize image /opt/stack/data/nova/instances/9189f862-2e91-4420-ab64-54375c4f9466/disk to a smaller size. 
{{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.objects.instance [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lazy-loading 'migration_context' on Instance uuid 9189f862-2e91-4420-ab64-54375c4f9466 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.network.neutron [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Updating instance_info_cache with network_info: [{"id": "9bd8e190-3b17-470b-9274-5060f7875bba", "address": "fa:16:3e:78:4f:ac", "network": {"id": "a64312c6-b1a8-450f-9e8d-0cf0282aef90", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-329219891-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1b4a2af680394ec889b4661753658b01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bd8e190-3b", "ovs_interfaceid": "9bd8e190-3b17-470b-9274-5060f7875bba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.compute.manager [req-4b20af73-7bed-4984-978b-6301af10acf4 
req-2f96a1fd-fa7b-462d-a4d3-4e08ba93f66e service nova] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Received event network-vif-plugged-0954983e-5cb4-4486-9816-67f4f5f78b35 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-4b20af73-7bed-4984-978b-6301af10acf4 req-2f96a1fd-fa7b-462d-a4d3-4e08ba93f66e service nova] Acquiring lock "f48e6aa1-dd33-42a4-89c9-20691b628c70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-4b20af73-7bed-4984-978b-6301af10acf4 req-2f96a1fd-fa7b-462d-a4d3-4e08ba93f66e service nova] Lock "f48e6aa1-dd33-42a4-89c9-20691b628c70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-4b20af73-7bed-4984-978b-6301af10acf4 req-2f96a1fd-fa7b-462d-a4d3-4e08ba93f66e service nova] Lock "f48e6aa1-dd33-42a4-89c9-20691b628c70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.compute.manager [req-4b20af73-7bed-4984-978b-6301af10acf4 req-2f96a1fd-fa7b-462d-a4d3-4e08ba93f66e service nova] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] No waiting events found dispatching network-vif-plugged-0954983e-5cb4-4486-9816-67f4f5f78b35 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:28:36 user nova-compute[71283]: WARNING nova.compute.manager [req-4b20af73-7bed-4984-978b-6301af10acf4 
req-2f96a1fd-fa7b-462d-a4d3-4e08ba93f66e service nova] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Received unexpected event network-vif-plugged-0954983e-5cb4-4486-9816-67f4f5f78b35 for instance with vm_state building and task_state spawning. Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Created local disks {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Ensure instance console log exists: /opt/stack/data/nova/instances/9189f862-2e91-4420-ab64-54375c4f9466/console.log {{(pid=71283) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG 
oslo_concurrency.lockutils [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Releasing lock "refresh_cache-0f7a669c-28aa-42d2-8991-5852336c0f42" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.compute.manager [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Instance network_info: |[{"id": "9bd8e190-3b17-470b-9274-5060f7875bba", "address": "fa:16:3e:78:4f:ac", "network": {"id": "a64312c6-b1a8-450f-9e8d-0cf0282aef90", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-329219891-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1b4a2af680394ec889b4661753658b01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bd8e190-3b", "ovs_interfaceid": "9bd8e190-3b17-470b-9274-5060f7875bba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-e6a10af7-cbd1-47e9-ba96-d255190a98eb req-9a789ce7-f254-417e-b04f-d86b832fe56b service nova] Acquired lock "refresh_cache-0f7a669c-28aa-42d2-8991-5852336c0f42" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.network.neutron [req-e6a10af7-cbd1-47e9-ba96-d255190a98eb req-9a789ce7-f254-417e-b04f-d86b832fe56b service nova] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Refreshing network info cache for port 9bd8e190-3b17-470b-9274-5060f7875bba {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Start _get_guest_xml network_info=[{"id": "9bd8e190-3b17-470b-9274-5060f7875bba", "address": "fa:16:3e:78:4f:ac", "network": {"id": "a64312c6-b1a8-450f-9e8d-0cf0282aef90", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-329219891-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1b4a2af680394ec889b4661753658b01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bd8e190-3b", "ovs_interfaceid": "9bd8e190-3b17-470b-9274-5060f7875bba", 
"qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '3687b278-c2bc-46f2-9b7e-579c8d06fe41'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 10:28:36 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:28:36 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71283) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T10:25:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=), allow threads: True {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Flavor limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Image limits 
0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Flavor pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Image pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 
tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Got 1 possible topologies {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:28:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-268849146',display_name='tempest-AttachVolumeTestJSON-server-268849146',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-268849146',id=3,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGAh9xOp+18xwLGeRalB9BZJ/w8VTQdLWJWLt/GaIgucfc2qbhwsBZgwV0eAZLo1CJ17o1m08+BmrJtQvN99qzggaLadMbVfpQ2nwEGkvh5hKNizDvlBcY5GFmW0Xmz1BQ==',key_name='tempest-keypair-2096466478',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1b4a2af680394ec889b4661753658b01',ramdisk_id='',reservation_id='r-du438i0g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-1715839687',owner_user_name='tempest-AttachVolumeTestJSON-1715839687-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:28:30Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='67955d2e81c04b8d8dbcbe577303e025',uuid=0f7a669c-28aa-42d2-8991-5852336c0f42,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9bd8e190-3b17-470b-9274-5060f7875bba", "address": "fa:16:3e:78:4f:ac", "network": {"id": "a64312c6-b1a8-450f-9e8d-0cf0282aef90", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-329219891-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", 
"type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1b4a2af680394ec889b4661753658b01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bd8e190-3b", "ovs_interfaceid": "9bd8e190-3b17-470b-9274-5060f7875bba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71283) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Converting VIF {"id": "9bd8e190-3b17-470b-9274-5060f7875bba", "address": "fa:16:3e:78:4f:ac", "network": {"id": "a64312c6-b1a8-450f-9e8d-0cf0282aef90", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-329219891-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1b4a2af680394ec889b4661753658b01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bd8e190-3b", "ovs_interfaceid": "9bd8e190-3b17-470b-9274-5060f7875bba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, 
"meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:4f:ac,bridge_name='br-int',has_traffic_filtering=True,id=9bd8e190-3b17-470b-9274-5060f7875bba,network=Network(a64312c6-b1a8-450f-9e8d-0cf0282aef90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bd8e190-3b') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.objects.instance [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lazy-loading 'pci_devices' on Instance uuid 0f7a669c-28aa-42d2-8991-5852336c0f42 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] End _get_guest_xml xml= Apr 20 10:28:36 user nova-compute[71283]: 0f7a669c-28aa-42d2-8991-5852336c0f42 Apr 20 10:28:36 user nova-compute[71283]: instance-00000003 Apr 20 10:28:36 user nova-compute[71283]: 131072 Apr 20 10:28:36 user nova-compute[71283]: 1 Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: tempest-AttachVolumeTestJSON-server-268849146 Apr 20 10:28:36 user nova-compute[71283]: 2023-04-20 10:28:36 Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: 128 Apr 20 10:28:36 user 
nova-compute[71283]: 1 Apr 20 10:28:36 user nova-compute[71283]: 0 Apr 20 10:28:36 user nova-compute[71283]: 0 Apr 20 10:28:36 user nova-compute[71283]: 1 Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: tempest-AttachVolumeTestJSON-1715839687-project-member Apr 20 10:28:36 user nova-compute[71283]: tempest-AttachVolumeTestJSON-1715839687 Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: OpenStack Foundation Apr 20 10:28:36 user nova-compute[71283]: OpenStack Nova Apr 20 10:28:36 user nova-compute[71283]: 0.0.0 Apr 20 10:28:36 user nova-compute[71283]: 0f7a669c-28aa-42d2-8991-5852336c0f42 Apr 20 10:28:36 user nova-compute[71283]: 0f7a669c-28aa-42d2-8991-5852336c0f42 Apr 20 10:28:36 user nova-compute[71283]: Virtual Machine Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: hvm Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 
user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Nehalem Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: /dev/urandom Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: Apr 20 10:28:36 user nova-compute[71283]: {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:28:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-268849146',display_name='tempest-AttachVolumeTestJSON-server-268849146',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-268849146',id=3,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGAh9xOp+18xwLGeRalB9BZJ/w8VTQdLWJWLt/GaIgucfc2qbhwsBZgwV0eAZLo1CJ17o1m08+BmrJtQvN99qzggaLadMbVfpQ2nwEGkvh5hKNizDvlBcY5GFmW0Xmz1BQ==',key_name='tempest-keypair-2096466478',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1b4a2af680394ec889b4661753658b01',ramdisk_id='',reservation_id='r-du438i0g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-1715839687',ow
ner_user_name='tempest-AttachVolumeTestJSON-1715839687-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:28:30Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='67955d2e81c04b8d8dbcbe577303e025',uuid=0f7a669c-28aa-42d2-8991-5852336c0f42,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9bd8e190-3b17-470b-9274-5060f7875bba", "address": "fa:16:3e:78:4f:ac", "network": {"id": "a64312c6-b1a8-450f-9e8d-0cf0282aef90", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-329219891-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1b4a2af680394ec889b4661753658b01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bd8e190-3b", "ovs_interfaceid": "9bd8e190-3b17-470b-9274-5060f7875bba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Converting VIF {"id": "9bd8e190-3b17-470b-9274-5060f7875bba", "address": "fa:16:3e:78:4f:ac", "network": {"id": "a64312c6-b1a8-450f-9e8d-0cf0282aef90", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-329219891-network", "subnets": 
[{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1b4a2af680394ec889b4661753658b01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bd8e190-3b", "ovs_interfaceid": "9bd8e190-3b17-470b-9274-5060f7875bba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:4f:ac,bridge_name='br-int',has_traffic_filtering=True,id=9bd8e190-3b17-470b-9274-5060f7875bba,network=Network(a64312c6-b1a8-450f-9e8d-0cf0282aef90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bd8e190-3b') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG os_vif [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:4f:ac,bridge_name='br-int',has_traffic_filtering=True,id=9bd8e190-3b17-470b-9274-5060f7875bba,network=Network(a64312c6-b1a8-450f-9e8d-0cf0282aef90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bd8e190-3b') {{(pid=71283) plug 
/usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9bd8e190-3b, may_exist=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9bd8e190-3b, col_values=(('external_ids', {'iface-id': '9bd8e190-3b17-470b-9274-5060f7875bba', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:78:4f:ac', 'vm-uuid': '0f7a669c-28aa-42d2-8991-5852336c0f42'}),)) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:36 user 
nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:36 user nova-compute[71283]: INFO os_vif [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:4f:ac,bridge_name='br-int',has_traffic_filtering=True,id=9bd8e190-3b17-470b-9274-5060f7875bba,network=Network(a64312c6-b1a8-450f-9e8d-0cf0282aef90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bd8e190-3b') Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71283) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 10:28:36 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] No VIF found with MAC fa:16:3e:78:4f:ac, not building metadata {{(pid=71283) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 10:28:37 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:37 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:37 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:37 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:37 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:38 user nova-compute[71283]: DEBUG nova.network.neutron [req-e6a10af7-cbd1-47e9-ba96-d255190a98eb req-9a789ce7-f254-417e-b04f-d86b832fe56b service nova] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Updated VIF entry in instance network info cache for port 9bd8e190-3b17-470b-9274-5060f7875bba. 
{{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:28:38 user nova-compute[71283]: DEBUG nova.network.neutron [req-e6a10af7-cbd1-47e9-ba96-d255190a98eb req-9a789ce7-f254-417e-b04f-d86b832fe56b service nova] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Updating instance_info_cache with network_info: [{"id": "9bd8e190-3b17-470b-9274-5060f7875bba", "address": "fa:16:3e:78:4f:ac", "network": {"id": "a64312c6-b1a8-450f-9e8d-0cf0282aef90", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-329219891-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1b4a2af680394ec889b4661753658b01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bd8e190-3b", "ovs_interfaceid": "9bd8e190-3b17-470b-9274-5060f7875bba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:28:38 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-e6a10af7-cbd1-47e9-ba96-d255190a98eb req-9a789ce7-f254-417e-b04f-d86b832fe56b service nova] Releasing lock "refresh_cache-0f7a669c-28aa-42d2-8991-5852336c0f42" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:28:38 user nova-compute[71283]: DEBUG nova.network.neutron [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] 
[instance: 9189f862-2e91-4420-ab64-54375c4f9466] Successfully created port: ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd {{(pid=71283) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 10:28:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:38 user nova-compute[71283]: DEBUG nova.compute.manager [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Instance event wait completed in 0 seconds for {{(pid=71283) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 10:28:38 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Guest created on hypervisor {{(pid=71283) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 10:28:38 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Resumed> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:28:38 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] VM Resumed (Lifecycle Event) Apr 20 10:28:38 user 
nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Instance spawned successfully. Apr 20 10:28:38 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 10:28:38 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:28:38 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:28:38 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Found default for hw_cdrom_bus of ide {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:38 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 
tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Found default for hw_disk_bus of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:38 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Found default for hw_input_bus of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:38 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Found default for hw_pointer_model of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:38 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Found default for hw_video_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:38 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Found default for hw_vif_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:38 user nova-compute[71283]: INFO 
nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 10:28:38 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Started> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:28:38 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] VM Started (Lifecycle Event) Apr 20 10:28:38 user nova-compute[71283]: DEBUG nova.compute.manager [req-406b7ccb-e5df-419c-b25a-9cb4453ae72d req-3ddf4514-321f-4803-b8e1-e40f8de9f295 service nova] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Received event network-vif-plugged-0954983e-5cb4-4486-9816-67f4f5f78b35 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:28:38 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-406b7ccb-e5df-419c-b25a-9cb4453ae72d req-3ddf4514-321f-4803-b8e1-e40f8de9f295 service nova] Acquiring lock "f48e6aa1-dd33-42a4-89c9-20691b628c70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:38 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-406b7ccb-e5df-419c-b25a-9cb4453ae72d req-3ddf4514-321f-4803-b8e1-e40f8de9f295 service nova] Lock "f48e6aa1-dd33-42a4-89c9-20691b628c70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:38 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-406b7ccb-e5df-419c-b25a-9cb4453ae72d req-3ddf4514-321f-4803-b8e1-e40f8de9f295 service nova] Lock 
"f48e6aa1-dd33-42a4-89c9-20691b628c70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:38 user nova-compute[71283]: DEBUG nova.compute.manager [req-406b7ccb-e5df-419c-b25a-9cb4453ae72d req-3ddf4514-321f-4803-b8e1-e40f8de9f295 service nova] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] No waiting events found dispatching network-vif-plugged-0954983e-5cb4-4486-9816-67f4f5f78b35 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:28:38 user nova-compute[71283]: WARNING nova.compute.manager [req-406b7ccb-e5df-419c-b25a-9cb4453ae72d req-3ddf4514-321f-4803-b8e1-e40f8de9f295 service nova] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Received unexpected event network-vif-plugged-0954983e-5cb4-4486-9816-67f4f5f78b35 for instance with vm_state building and task_state spawning. Apr 20 10:28:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:38 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:28:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:38 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) 
handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:28:38 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 10:28:38 user nova-compute[71283]: INFO nova.compute.manager [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Took 9.62 seconds to spawn the instance on the hypervisor. Apr 20 10:28:38 user nova-compute[71283]: DEBUG nova.compute.manager [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:28:38 user nova-compute[71283]: INFO nova.compute.manager [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Took 10.29 seconds to build instance. 
Apr 20 10:28:38 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1fc55f6a-1391-41bd-95c5-5cce87886a29 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Lock "f48e6aa1-dd33-42a4-89c9-20691b628c70" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.427s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:39 user nova-compute[71283]: DEBUG nova.network.neutron [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Successfully updated port: 851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1 {{(pid=71283) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 10:28:39 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Acquiring lock "refresh_cache-64fb0d55-ef35-4386-86fe-00775b83a8d4" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:28:39 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Acquired lock "refresh_cache-64fb0d55-ef35-4386-86fe-00775b83a8d4" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:28:39 user nova-compute[71283]: DEBUG nova.network.neutron [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Building network info cache for instance {{(pid=71283) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 10:28:39 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:39 user nova-compute[71283]: DEBUG nova.network.neutron [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Successfully updated port: ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd {{(pid=71283) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 10:28:39 user nova-compute[71283]: DEBUG nova.network.neutron [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Instance cache missing network info. {{(pid=71283) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 10:28:39 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Acquiring lock "refresh_cache-9189f862-2e91-4420-ab64-54375c4f9466" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:28:39 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Acquired lock "refresh_cache-9189f862-2e91-4420-ab64-54375c4f9466" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:28:39 user nova-compute[71283]: DEBUG nova.network.neutron [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 
tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Building network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 10:28:39 user nova-compute[71283]: DEBUG nova.compute.manager [req-8cf55133-4ebf-47b1-98f3-b986c52c3acf req-408559e4-d0fc-454b-add7-4d3cf0ccd8d1 service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Received event network-changed-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:28:39 user nova-compute[71283]: DEBUG nova.compute.manager [req-8cf55133-4ebf-47b1-98f3-b986c52c3acf req-408559e4-d0fc-454b-add7-4d3cf0ccd8d1 service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Refreshing instance network info cache due to event network-changed-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd. {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:28:39 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-8cf55133-4ebf-47b1-98f3-b986c52c3acf req-408559e4-d0fc-454b-add7-4d3cf0ccd8d1 service nova] Acquiring lock "refresh_cache-9189f862-2e91-4420-ab64-54375c4f9466" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:28:39 user nova-compute[71283]: DEBUG nova.network.neutron [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Instance cache missing network info. 
{{(pid=71283) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 10:28:39 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:39 user nova-compute[71283]: DEBUG nova.compute.manager [req-3883f404-128d-44ba-986c-9bbcb093188d req-70da1afa-7d5d-43ee-9793-106cafd1a3d0 service nova] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Received event network-changed-851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:28:39 user nova-compute[71283]: DEBUG nova.compute.manager [req-3883f404-128d-44ba-986c-9bbcb093188d req-70da1afa-7d5d-43ee-9793-106cafd1a3d0 service nova] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Refreshing instance network info cache due to event network-changed-851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1. {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:28:39 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-3883f404-128d-44ba-986c-9bbcb093188d req-70da1afa-7d5d-43ee-9793-106cafd1a3d0 service nova] Acquiring lock "refresh_cache-64fb0d55-ef35-4386-86fe-00775b83a8d4" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:28:39 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:39 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.network.neutron [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 
9189f862-2e91-4420-ab64-54375c4f9466] Updating instance_info_cache with network_info: [{"id": "ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd", "address": "fa:16:3e:39:2f:ad", "network": {"id": "55052d2c-5064-4bd3-9b46-6e90a5ee3def", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1948401123-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7c3ecc1463ec42eea56f2890b032ef7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapab1a6c29-6d", "ovs_interfaceid": "ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Releasing lock "refresh_cache-9189f862-2e91-4420-ab64-54375c4f9466" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.compute.manager [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Instance network_info: |[{"id": "ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd", "address": "fa:16:3e:39:2f:ad", "network": {"id": 
"55052d2c-5064-4bd3-9b46-6e90a5ee3def", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1948401123-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7c3ecc1463ec42eea56f2890b032ef7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapab1a6c29-6d", "ovs_interfaceid": "ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-8cf55133-4ebf-47b1-98f3-b986c52c3acf req-408559e4-d0fc-454b-add7-4d3cf0ccd8d1 service nova] Acquired lock "refresh_cache-9189f862-2e91-4420-ab64-54375c4f9466" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.network.neutron [req-8cf55133-4ebf-47b1-98f3-b986c52c3acf req-408559e4-d0fc-454b-add7-4d3cf0ccd8d1 service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Refreshing network info cache for port ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Start _get_guest_xml 
network_info=[{"id": "ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd", "address": "fa:16:3e:39:2f:ad", "network": {"id": "55052d2c-5064-4bd3-9b46-6e90a5ee3def", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1948401123-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7c3ecc1463ec42eea56f2890b032ef7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapab1a6c29-6d", "ovs_interfaceid": "ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': 
'3687b278-c2bc-46f2-9b7e-579c8d06fe41'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 10:28:40 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:28:40 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71283) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T10:25:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=), allow threads: True {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Flavor limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Image limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Flavor pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Image pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 
tempest-VolumesAdminNegativeTest-2134667240-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Got 1 possible topologies {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Sorted desired topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:28:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-42507989',display_name='tempest-VolumesAdminNegativeTest-server-42507989',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-42507989',id=5,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKo6j6WbCBGkhY+/90E7m8dSLZQNugKiG3Ze4NFSIzNr6U7x488jnMXjls3Or9lOdNpk237JBBGIDaL4rNwHM9ccpiAVRM3eozuQ2DPkyUdqXkWu62GoK7gs5ZQh/CHghQ==',key_name='tempest-keypair-1241550354',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7c3ecc1463ec42eea56f2890b032ef7a',ramdisk_id='',reservation_id='r-5fxeobjd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-2134667240',owner_user_name='tempest-VolumesAdminNegativeTest-2134667240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:28:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='712be6d6876f4b2c9d796e406a43f8bf',uuid=9189f862-2e91-4420-ab64-54375c4f9466,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd", "address": "fa:16:3e:39:2f:ad", "network": {"id": "55052d2c-5064-4bd3-9b46-6e90a5ee3def", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1948401123-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": 
"10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7c3ecc1463ec42eea56f2890b032ef7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapab1a6c29-6d", "ovs_interfaceid": "ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71283) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Converting VIF {"id": "ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd", "address": "fa:16:3e:39:2f:ad", "network": {"id": "55052d2c-5064-4bd3-9b46-6e90a5ee3def", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1948401123-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7c3ecc1463ec42eea56f2890b032ef7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapab1a6c29-6d", "ovs_interfaceid": "ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:2f:ad,bridge_name='br-int',has_traffic_filtering=True,id=ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd,network=Network(55052d2c-5064-4bd3-9b46-6e90a5ee3def),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab1a6c29-6d') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.objects.instance [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lazy-loading 'pci_devices' on Instance uuid 9189f862-2e91-4420-ab64-54375c4f9466 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] End _get_guest_xml xml= Apr 20 10:28:40 user nova-compute[71283]: 9189f862-2e91-4420-ab64-54375c4f9466 Apr 20 10:28:40 user nova-compute[71283]: instance-00000005 Apr 20 10:28:40 user nova-compute[71283]: 131072 Apr 20 10:28:40 user nova-compute[71283]: 1 Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: tempest-VolumesAdminNegativeTest-server-42507989 Apr 20 10:28:40 user nova-compute[71283]: 2023-04-20 10:28:40 Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user 
nova-compute[71283]: 128 Apr 20 10:28:40 user nova-compute[71283]: 1 Apr 20 10:28:40 user nova-compute[71283]: 0 Apr 20 10:28:40 user nova-compute[71283]: 0 Apr 20 10:28:40 user nova-compute[71283]: 1 Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: tempest-VolumesAdminNegativeTest-2134667240-project-member Apr 20 10:28:40 user nova-compute[71283]: tempest-VolumesAdminNegativeTest-2134667240 Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: OpenStack Foundation Apr 20 10:28:40 user nova-compute[71283]: OpenStack Nova Apr 20 10:28:40 user nova-compute[71283]: 0.0.0 Apr 20 10:28:40 user nova-compute[71283]: 9189f862-2e91-4420-ab64-54375c4f9466 Apr 20 10:28:40 user nova-compute[71283]: 9189f862-2e91-4420-ab64-54375c4f9466 Apr 20 10:28:40 user nova-compute[71283]: Virtual Machine Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: hvm Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 
20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Nehalem Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: /dev/urandom Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:28:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-42507989',display_name='tempest-VolumesAdminNegativeTest-server-42507989',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-42507989',id=5,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKo6j6WbCBGkhY+/90E7m8dSLZQNugKiG3Ze4NFSIzNr6U7x488jnMXjls3Or9lOdNpk237JBBGIDaL4rNwHM9ccpiAVRM3eozuQ2DPkyUdqXkWu62GoK7gs5ZQh/CHghQ==',key_name='tempest-keypair-1241550354',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7c3ecc1463ec42eea56f2890b032ef7a',ramdisk_id='',reservation_id='r-5fxeobjd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-2
134667240',owner_user_name='tempest-VolumesAdminNegativeTest-2134667240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:28:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='712be6d6876f4b2c9d796e406a43f8bf',uuid=9189f862-2e91-4420-ab64-54375c4f9466,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd", "address": "fa:16:3e:39:2f:ad", "network": {"id": "55052d2c-5064-4bd3-9b46-6e90a5ee3def", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1948401123-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7c3ecc1463ec42eea56f2890b032ef7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapab1a6c29-6d", "ovs_interfaceid": "ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Converting VIF {"id": "ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd", "address": "fa:16:3e:39:2f:ad", "network": {"id": "55052d2c-5064-4bd3-9b46-6e90a5ee3def", "bridge": "br-int", "label": 
"tempest-VolumesAdminNegativeTest-1948401123-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7c3ecc1463ec42eea56f2890b032ef7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapab1a6c29-6d", "ovs_interfaceid": "ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:2f:ad,bridge_name='br-int',has_traffic_filtering=True,id=ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd,network=Network(55052d2c-5064-4bd3-9b46-6e90a5ee3def),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab1a6c29-6d') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG os_vif [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Plugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:39:2f:ad,bridge_name='br-int',has_traffic_filtering=True,id=ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd,network=Network(55052d2c-5064-4bd3-9b46-6e90a5ee3def),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab1a6c29-6d') {{(pid=71283) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapab1a6c29-6d, may_exist=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapab1a6c29-6d, col_values=(('external_ids', {'iface-id': 'ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:39:2f:ad', 'vm-uuid': '9189f862-2e91-4420-ab64-54375c4f9466'}),)) 
{{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:40 user nova-compute[71283]: INFO os_vif [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:2f:ad,bridge_name='br-int',has_traffic_filtering=True,id=ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd,network=Network(55052d2c-5064-4bd3-9b46-6e90a5ee3def),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab1a6c29-6d') Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71283) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] No VIF found with MAC fa:16:3e:39:2f:ad, not building metadata {{(pid=71283) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Resumed> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:28:40 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] VM Resumed (Lifecycle Event) Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.compute.manager [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Instance event wait completed in 0 seconds for {{(pid=71283) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Guest created on hypervisor {{(pid=71283) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 10:28:40 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Instance spawned successfully. 
Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Found default for hw_cdrom_bus of ide {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Found default for hw_disk_bus of virtio {{(pid=71283) _register_undefined_instance_details 
/opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Found default for hw_input_bus of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Found default for hw_pointer_model of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Found default for hw_video_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Found default for hw_vif_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:40 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Started> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:28:40 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] VM Started (Lifecycle Event) Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:28:40 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 10:28:40 user nova-compute[71283]: INFO nova.compute.manager [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Took 10.50 seconds to spawn the instance on the hypervisor. 
Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.compute.manager [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.network.neutron [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Updating instance_info_cache with network_info: [{"id": "851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1", "address": "fa:16:3e:fe:b8:27", "network": {"id": "e68379ef-0781-4af6-be68-a311cdded61e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-509902278-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8d9c5dd66fd0496a8e92b41d98fe1727", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap851e6a9d-9d", "ovs_interfaceid": "851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Releasing lock 
"refresh_cache-64fb0d55-ef35-4386-86fe-00775b83a8d4" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.compute.manager [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Instance network_info: |[{"id": "851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1", "address": "fa:16:3e:fe:b8:27", "network": {"id": "e68379ef-0781-4af6-be68-a311cdded61e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-509902278-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8d9c5dd66fd0496a8e92b41d98fe1727", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap851e6a9d-9d", "ovs_interfaceid": "851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-3883f404-128d-44ba-986c-9bbcb093188d req-70da1afa-7d5d-43ee-9793-106cafd1a3d0 service nova] Acquired lock "refresh_cache-64fb0d55-ef35-4386-86fe-00775b83a8d4" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.network.neutron [req-3883f404-128d-44ba-986c-9bbcb093188d 
req-70da1afa-7d5d-43ee-9793-106cafd1a3d0 service nova] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Refreshing network info cache for port 851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1 {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Start _get_guest_xml network_info=[{"id": "851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1", "address": "fa:16:3e:fe:b8:27", "network": {"id": "e68379ef-0781-4af6-be68-a311cdded61e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-509902278-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8d9c5dd66fd0496a8e92b41d98fe1727", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap851e6a9d-9d", "ovs_interfaceid": "851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '3687b278-c2bc-46f2-9b7e-579c8d06fe41'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 10:28:40 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:28:40 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71283) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T10:25:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=), allow threads: True {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Flavor limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Image limits 
0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Flavor pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Image pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 
tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Got 1 possible topologies {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:28:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1446443267',display_name='tempest-ServerActionsTestJSON-server-1446443267',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serveractionstestjson-server-1446443267',id=4,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJHK0F1a8yp8Xvld+jR/l8lxpwDFkBJKvOkeodCIMdNIsy3rZb2cz1b4Lbds32NZ3fZXZBlAND7bKaMCernVFVPVOsJFwYkZamLCbeIJtO4ahR/NTi6fJvZoUyyvCYQG3g==',key_name='tempest-keypair-748576161',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8d9c5dd66fd0496a8e92b41d98fe1727',ramdisk_id='',reservation_id='r-01ns16zv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-692026271',owner_user_name='tempest-ServerActionsTestJSON-692026271-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:28:31Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bc5470a076e14c218cf04fad713ce074',uuid=64fb0d55-ef35-4386-86fe-00775b83a8d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1", "address": "fa:16:3e:fe:b8:27", "network": {"id": "e68379ef-0781-4af6-be68-a311cdded61e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-509902278-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", 
"type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8d9c5dd66fd0496a8e92b41d98fe1727", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap851e6a9d-9d", "ovs_interfaceid": "851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71283) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Converting VIF {"id": "851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1", "address": "fa:16:3e:fe:b8:27", "network": {"id": "e68379ef-0781-4af6-be68-a311cdded61e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-509902278-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8d9c5dd66fd0496a8e92b41d98fe1727", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap851e6a9d-9d", "ovs_interfaceid": "851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, 
"meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fe:b8:27,bridge_name='br-int',has_traffic_filtering=True,id=851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1,network=Network(e68379ef-0781-4af6-be68-a311cdded61e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap851e6a9d-9d') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.objects.instance [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Lazy-loading 'pci_devices' on Instance uuid 64fb0d55-ef35-4386-86fe-00775b83a8d4 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] End _get_guest_xml xml= Apr 20 10:28:40 user nova-compute[71283]: 64fb0d55-ef35-4386-86fe-00775b83a8d4 Apr 20 10:28:40 user nova-compute[71283]: instance-00000004 Apr 20 10:28:40 user nova-compute[71283]: 131072 Apr 20 10:28:40 user nova-compute[71283]: 1 Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: tempest-ServerActionsTestJSON-server-1446443267 Apr 20 10:28:40 user nova-compute[71283]: 2023-04-20 10:28:40 Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: 128 Apr 20 10:28:40 user 
nova-compute[71283]: 1 Apr 20 10:28:40 user nova-compute[71283]: 0 Apr 20 10:28:40 user nova-compute[71283]: 0 Apr 20 10:28:40 user nova-compute[71283]: 1 Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: tempest-ServerActionsTestJSON-692026271-project-member Apr 20 10:28:40 user nova-compute[71283]: tempest-ServerActionsTestJSON-692026271 Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: OpenStack Foundation Apr 20 10:28:40 user nova-compute[71283]: OpenStack Nova Apr 20 10:28:40 user nova-compute[71283]: 0.0.0 Apr 20 10:28:40 user nova-compute[71283]: 64fb0d55-ef35-4386-86fe-00775b83a8d4 Apr 20 10:28:40 user nova-compute[71283]: 64fb0d55-ef35-4386-86fe-00775b83a8d4 Apr 20 10:28:40 user nova-compute[71283]: Virtual Machine Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: hvm Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 
user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Nehalem Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: /dev/urandom Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: Apr 20 10:28:40 user nova-compute[71283]: {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:28:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1446443267',display_name='tempest-ServerActionsTestJSON-server-1446443267',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serveractionstestjson-server-1446443267',id=4,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJHK0F1a8yp8Xvld+jR/l8lxpwDFkBJKvOkeodCIMdNIsy3rZb2cz1b4Lbds32NZ3fZXZBlAND7bKaMCernVFVPVOsJFwYkZamLCbeIJtO4ahR/NTi6fJvZoUyyvCYQG3g==',key_name='tempest-keypair-748576161',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8d9c5dd66fd0496a8e92b41d98fe1727',ramdisk_id='',reservation_id='r-01ns16zv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-69202627
1',owner_user_name='tempest-ServerActionsTestJSON-692026271-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:28:31Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bc5470a076e14c218cf04fad713ce074',uuid=64fb0d55-ef35-4386-86fe-00775b83a8d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1", "address": "fa:16:3e:fe:b8:27", "network": {"id": "e68379ef-0781-4af6-be68-a311cdded61e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-509902278-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8d9c5dd66fd0496a8e92b41d98fe1727", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap851e6a9d-9d", "ovs_interfaceid": "851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Converting VIF {"id": "851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1", "address": "fa:16:3e:fe:b8:27", "network": {"id": "e68379ef-0781-4af6-be68-a311cdded61e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-509902278-network", "subnets": 
[{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8d9c5dd66fd0496a8e92b41d98fe1727", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap851e6a9d-9d", "ovs_interfaceid": "851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fe:b8:27,bridge_name='br-int',has_traffic_filtering=True,id=851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1,network=Network(e68379ef-0781-4af6-be68-a311cdded61e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap851e6a9d-9d') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG os_vif [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:b8:27,bridge_name='br-int',has_traffic_filtering=True,id=851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1,network=Network(e68379ef-0781-4af6-be68-a311cdded61e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap851e6a9d-9d') {{(pid=71283) plug 
/usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap851e6a9d-9d, may_exist=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap851e6a9d-9d, col_values=(('external_ids', {'iface-id': '851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fe:b8:27', 'vm-uuid': '64fb0d55-ef35-4386-86fe-00775b83a8d4'}),)) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:28:40 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:40 user 
nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:40 user nova-compute[71283]: INFO os_vif [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:b8:27,bridge_name='br-int',has_traffic_filtering=True,id=851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1,network=Network(e68379ef-0781-4af6-be68-a311cdded61e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap851e6a9d-9d') Apr 20 10:28:40 user nova-compute[71283]: INFO nova.compute.manager [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Took 11.32 seconds to build instance. Apr 20 10:28:41 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f1e00853-89b1-4f49-8378-ca1cb6aeb820 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "0f7a669c-28aa-42d2-8991-5852336c0f42" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 11.452s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:41 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71283) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 10:28:41 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] No VIF found with MAC fa:16:3e:fe:b8:27, not building metadata {{(pid=71283) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 10:28:41 user nova-compute[71283]: DEBUG nova.network.neutron [req-8cf55133-4ebf-47b1-98f3-b986c52c3acf req-408559e4-d0fc-454b-add7-4d3cf0ccd8d1 service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Updated VIF entry in instance network info cache for port ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd. {{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:28:41 user nova-compute[71283]: DEBUG nova.network.neutron [req-8cf55133-4ebf-47b1-98f3-b986c52c3acf req-408559e4-d0fc-454b-add7-4d3cf0ccd8d1 service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Updating instance_info_cache with network_info: [{"id": "ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd", "address": "fa:16:3e:39:2f:ad", "network": {"id": "55052d2c-5064-4bd3-9b46-6e90a5ee3def", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1948401123-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7c3ecc1463ec42eea56f2890b032ef7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapab1a6c29-6d", "ovs_interfaceid": "ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd", "qbh_params": null, "qbg_params": null, 
"active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:28:41 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-8cf55133-4ebf-47b1-98f3-b986c52c3acf req-408559e4-d0fc-454b-add7-4d3cf0ccd8d1 service nova] Releasing lock "refresh_cache-9189f862-2e91-4420-ab64-54375c4f9466" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:28:41 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:41 user nova-compute[71283]: DEBUG nova.compute.manager [req-dbb96b87-7c81-4e70-a69b-94dc1afe23b7 req-6dad13cd-9a05-4313-8ef6-d9c8026ae141 service nova] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Received event network-vif-plugged-9bd8e190-3b17-470b-9274-5060f7875bba {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:28:41 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-dbb96b87-7c81-4e70-a69b-94dc1afe23b7 req-6dad13cd-9a05-4313-8ef6-d9c8026ae141 service nova] Acquiring lock "0f7a669c-28aa-42d2-8991-5852336c0f42-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:41 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-dbb96b87-7c81-4e70-a69b-94dc1afe23b7 req-6dad13cd-9a05-4313-8ef6-d9c8026ae141 service nova] Lock "0f7a669c-28aa-42d2-8991-5852336c0f42-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:41 user nova-compute[71283]: DEBUG 
oslo_concurrency.lockutils [req-dbb96b87-7c81-4e70-a69b-94dc1afe23b7 req-6dad13cd-9a05-4313-8ef6-d9c8026ae141 service nova] Lock "0f7a669c-28aa-42d2-8991-5852336c0f42-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:41 user nova-compute[71283]: DEBUG nova.compute.manager [req-dbb96b87-7c81-4e70-a69b-94dc1afe23b7 req-6dad13cd-9a05-4313-8ef6-d9c8026ae141 service nova] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] No waiting events found dispatching network-vif-plugged-9bd8e190-3b17-470b-9274-5060f7875bba {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:28:41 user nova-compute[71283]: WARNING nova.compute.manager [req-dbb96b87-7c81-4e70-a69b-94dc1afe23b7 req-6dad13cd-9a05-4313-8ef6-d9c8026ae141 service nova] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Received unexpected event network-vif-plugged-9bd8e190-3b17-470b-9274-5060f7875bba for instance with vm_state active and task_state None. Apr 20 10:28:42 user nova-compute[71283]: DEBUG nova.network.neutron [req-3883f404-128d-44ba-986c-9bbcb093188d req-70da1afa-7d5d-43ee-9793-106cafd1a3d0 service nova] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Updated VIF entry in instance network info cache for port 851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1. 
{{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:28:42 user nova-compute[71283]: DEBUG nova.network.neutron [req-3883f404-128d-44ba-986c-9bbcb093188d req-70da1afa-7d5d-43ee-9793-106cafd1a3d0 service nova] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Updating instance_info_cache with network_info: [{"id": "851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1", "address": "fa:16:3e:fe:b8:27", "network": {"id": "e68379ef-0781-4af6-be68-a311cdded61e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-509902278-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8d9c5dd66fd0496a8e92b41d98fe1727", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap851e6a9d-9d", "ovs_interfaceid": "851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:28:42 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-3883f404-128d-44ba-986c-9bbcb093188d req-70da1afa-7d5d-43ee-9793-106cafd1a3d0 service nova] Releasing lock "refresh_cache-64fb0d55-ef35-4386-86fe-00775b83a8d4" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:28:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:43 user 
nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:44 user nova-compute[71283]: DEBUG nova.compute.manager [req-1a82308e-456d-4b2e-81c2-57c974346635 req-2c4b2dae-81d4-4e70-9d3c-35043d473b01 service nova] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Received event network-vif-plugged-9bd8e190-3b17-470b-9274-5060f7875bba {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:28:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-1a82308e-456d-4b2e-81c2-57c974346635 req-2c4b2dae-81d4-4e70-9d3c-35043d473b01 service nova] Acquiring lock "0f7a669c-28aa-42d2-8991-5852336c0f42-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-1a82308e-456d-4b2e-81c2-57c974346635 req-2c4b2dae-81d4-4e70-9d3c-35043d473b01 service nova] Lock "0f7a669c-28aa-42d2-8991-5852336c0f42-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" ::
waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-1a82308e-456d-4b2e-81c2-57c974346635 req-2c4b2dae-81d4-4e70-9d3c-35043d473b01 service nova] Lock "0f7a669c-28aa-42d2-8991-5852336c0f42-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:44 user nova-compute[71283]: DEBUG nova.compute.manager [req-1a82308e-456d-4b2e-81c2-57c974346635 req-2c4b2dae-81d4-4e70-9d3c-35043d473b01 service nova] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] No waiting events found dispatching network-vif-plugged-9bd8e190-3b17-470b-9274-5060f7875bba {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:28:44 user nova-compute[71283]: WARNING nova.compute.manager [req-1a82308e-456d-4b2e-81c2-57c974346635 req-2c4b2dae-81d4-4e70-9d3c-35043d473b01 service nova] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Received unexpected event network-vif-plugged-9bd8e190-3b17-470b-9274-5060f7875bba for instance with vm_state active and task_state None.
Apr 20 10:28:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:44 user nova-compute[71283]: DEBUG nova.compute.manager [req-eb74b60a-6525-4167-9fb1-c65d87f67c61 req-5f7eb7f1-0ecd-43b9-b100-2fda2eb1d7d7 service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Received event network-vif-plugged-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:28:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-eb74b60a-6525-4167-9fb1-c65d87f67c61 req-5f7eb7f1-0ecd-43b9-b100-2fda2eb1d7d7 service nova] Acquiring lock "9189f862-2e91-4420-ab64-54375c4f9466-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-eb74b60a-6525-4167-9fb1-c65d87f67c61 req-5f7eb7f1-0ecd-43b9-b100-2fda2eb1d7d7 service nova] Lock "9189f862-2e91-4420-ab64-54375c4f9466-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-eb74b60a-6525-4167-9fb1-c65d87f67c61 req-5f7eb7f1-0ecd-43b9-b100-2fda2eb1d7d7 service nova] Lock "9189f862-2e91-4420-ab64-54375c4f9466-events" "released" by
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:44 user nova-compute[71283]: DEBUG nova.compute.manager [req-eb74b60a-6525-4167-9fb1-c65d87f67c61 req-5f7eb7f1-0ecd-43b9-b100-2fda2eb1d7d7 service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] No waiting events found dispatching network-vif-plugged-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:28:44 user nova-compute[71283]: WARNING nova.compute.manager [req-eb74b60a-6525-4167-9fb1-c65d87f67c61 req-5f7eb7f1-0ecd-43b9-b100-2fda2eb1d7d7 service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Received unexpected event network-vif-plugged-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd for instance with vm_state building and task_state spawning. Apr 20 10:28:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:44 user nova-compute[71283]: DEBUG nova.compute.manager [req-a716b319-6043-4ac4-aa5a-623b670693b6 
req-2af4135d-e6a6-4b50-9c84-e1ef74d036fa service nova] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Received event network-vif-plugged-851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:28:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-a716b319-6043-4ac4-aa5a-623b670693b6 req-2af4135d-e6a6-4b50-9c84-e1ef74d036fa service nova] Acquiring lock "64fb0d55-ef35-4386-86fe-00775b83a8d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-a716b319-6043-4ac4-aa5a-623b670693b6 req-2af4135d-e6a6-4b50-9c84-e1ef74d036fa service nova] Lock "64fb0d55-ef35-4386-86fe-00775b83a8d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-a716b319-6043-4ac4-aa5a-623b670693b6 req-2af4135d-e6a6-4b50-9c84-e1ef74d036fa service nova] Lock "64fb0d55-ef35-4386-86fe-00775b83a8d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:44 user nova-compute[71283]: DEBUG nova.compute.manager [req-a716b319-6043-4ac4-aa5a-623b670693b6 req-2af4135d-e6a6-4b50-9c84-e1ef74d036fa service nova] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] No waiting events found dispatching network-vif-plugged-851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:28:44 user nova-compute[71283]: WARNING nova.compute.manager [req-a716b319-6043-4ac4-aa5a-623b670693b6
req-2af4135d-e6a6-4b50-9c84-e1ef74d036fa service nova] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Received unexpected event network-vif-plugged-851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1 for instance with vm_state building and task_state spawning. Apr 20 10:28:45 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:45 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:45 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:45 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:46 user nova-compute[71283]: DEBUG nova.compute.manager [req-67ffd02b-4bf1-4e8b-8bde-52e293236ea0 req-2bcf662e-ef20-40f9-b2b4-5a9befc1fa14 service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Received event network-vif-plugged-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:28:46 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-67ffd02b-4bf1-4e8b-8bde-52e293236ea0 req-2bcf662e-ef20-40f9-b2b4-5a9befc1fa14 service nova] Acquiring lock "9189f862-2e91-4420-ab64-54375c4f9466-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:46 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-67ffd02b-4bf1-4e8b-8bde-52e293236ea0 req-2bcf662e-ef20-40f9-b2b4-5a9befc1fa14 service nova] Lock "9189f862-2e91-4420-ab64-54375c4f9466-events"
acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:46 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-67ffd02b-4bf1-4e8b-8bde-52e293236ea0 req-2bcf662e-ef20-40f9-b2b4-5a9befc1fa14 service nova] Lock "9189f862-2e91-4420-ab64-54375c4f9466-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:46 user nova-compute[71283]: DEBUG nova.compute.manager [req-67ffd02b-4bf1-4e8b-8bde-52e293236ea0 req-2bcf662e-ef20-40f9-b2b4-5a9befc1fa14 service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] No waiting events found dispatching network-vif-plugged-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:28:46 user nova-compute[71283]: WARNING nova.compute.manager [req-67ffd02b-4bf1-4e8b-8bde-52e293236ea0 req-2bcf662e-ef20-40f9-b2b4-5a9befc1fa14 service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Received unexpected event network-vif-plugged-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd for instance with vm_state building and task_state spawning. 
Apr 20 10:28:46 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Resumed> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:28:46 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] VM Resumed (Lifecycle Event) Apr 20 10:28:46 user nova-compute[71283]: DEBUG nova.compute.manager [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Instance event wait completed in 0 seconds for {{(pid=71283) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 10:28:46 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Guest created on hypervisor {{(pid=71283) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 10:28:46 user nova-compute[71283]: DEBUG nova.compute.manager [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Instance event wait completed in 0 seconds for {{(pid=71283) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 10:28:46 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Guest created on hypervisor {{(pid=71283) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 10:28:46 user nova-compute[71283]: INFO 
nova.virt.libvirt.driver [-] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Instance spawned successfully. Apr 20 10:28:46 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 10:28:46 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:28:46 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Instance spawned successfully. 
Apr 20 10:28:46 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 10:28:46 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:28:46 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Found default for hw_cdrom_bus of ide {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:46 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Found default for hw_disk_bus of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:46 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 
tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Found default for hw_input_bus of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:46 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Found default for hw_pointer_model of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:46 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Found default for hw_video_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:46 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Found default for hw_vif_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:46 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 20 10:28:47 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Started> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:28:47 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] VM Started (Lifecycle Event) Apr 20 10:28:47 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Found default for hw_cdrom_bus of ide {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:47 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Found default for hw_disk_bus of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:47 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Found default for hw_input_bus of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:47 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Found default for hw_pointer_model of None {{(pid=71283) 
_register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:47 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Found default for hw_video_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:47 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Found default for hw_vif_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:47 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:28:47 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:28:47 user nova-compute[71283]: DEBUG nova.compute.manager [req-6719b81a-1ecf-45bf-b87f-d25f4d6971b6 req-20621a90-4563-48e9-89d6-9eef44a872aa service nova] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Received event network-vif-plugged-851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 
10:28:47 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-6719b81a-1ecf-45bf-b87f-d25f4d6971b6 req-20621a90-4563-48e9-89d6-9eef44a872aa service nova] Acquiring lock "64fb0d55-ef35-4386-86fe-00775b83a8d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:47 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-6719b81a-1ecf-45bf-b87f-d25f4d6971b6 req-20621a90-4563-48e9-89d6-9eef44a872aa service nova] Lock "64fb0d55-ef35-4386-86fe-00775b83a8d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:47 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-6719b81a-1ecf-45bf-b87f-d25f4d6971b6 req-20621a90-4563-48e9-89d6-9eef44a872aa service nova] Lock "64fb0d55-ef35-4386-86fe-00775b83a8d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:47 user nova-compute[71283]: DEBUG nova.compute.manager [req-6719b81a-1ecf-45bf-b87f-d25f4d6971b6 req-20621a90-4563-48e9-89d6-9eef44a872aa service nova] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] No waiting events found dispatching network-vif-plugged-851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:28:47 user nova-compute[71283]: WARNING nova.compute.manager [req-6719b81a-1ecf-45bf-b87f-d25f4d6971b6 req-20621a90-4563-48e9-89d6-9eef44a872aa service nova] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Received unexpected event network-vif-plugged-851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1 for instance with vm_state building and task_state spawning.
Apr 20 10:28:47 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 10:28:47 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Resumed> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:28:47 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] VM Resumed (Lifecycle Event) Apr 20 10:28:47 user nova-compute[71283]: INFO nova.compute.manager [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Took 11.33 seconds to spawn the instance on the hypervisor. Apr 20 10:28:47 user nova-compute[71283]: DEBUG nova.compute.manager [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:28:47 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:28:47 user nova-compute[71283]: INFO nova.compute.manager [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Took 15.64 seconds to spawn the instance on the hypervisor. 
Apr 20 10:28:47 user nova-compute[71283]: DEBUG nova.compute.manager [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:28:47 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:28:47 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 10:28:47 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Started> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:28:47 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] VM Started (Lifecycle Event) Apr 20 10:28:47 user nova-compute[71283]: INFO nova.compute.manager [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Took 12.39 seconds to build instance. 
Apr 20 10:28:47 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e21bbb5c-11c2-440c-b2f2-682cadd8bff7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "9189f862-2e91-4420-ab64-54375c4f9466" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.503s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:47 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:28:47 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:28:47 user nova-compute[71283]: INFO nova.compute.manager [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Took 16.69 seconds to build instance.
Apr 20 10:28:47 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-856f1ea8-6212-485f-8afc-ca7fad99b597 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Lock "64fb0d55-ef35-4386-86fe-00775b83a8d4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.809s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:48 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:50 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Acquiring lock "bf7d300f-b748-446b-97ce-6cae12609e7a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:50 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Lock "bf7d300f-b748-446b-97ce-6cae12609e7a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:50 user nova-compute[71283]: DEBUG nova.compute.manager [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Starting instance... {{(pid=71283) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 10:28:50 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:50 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:50 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71283) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 10:28:50 user nova-compute[71283]: INFO nova.compute.claims [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Claim successful on node user Apr 20 10:28:50 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:51 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:28:51 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:28:51 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Lock "compute_resources" 
"released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.459s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:51 user nova-compute[71283]: DEBUG nova.compute.manager [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Start building networks asynchronously for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 10:28:51 user nova-compute[71283]: DEBUG nova.compute.manager [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Allocating IP information in the background. {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 10:28:51 user nova-compute[71283]: DEBUG nova.network.neutron [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] allocate_for_instance() {{(pid=71283) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 10:28:51 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Ignoring supplied device name: /dev/sda. 
Libvirt can't honour user-supplied dev names Apr 20 10:28:51 user nova-compute[71283]: DEBUG nova.compute.manager [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Start building block device mappings for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 10:28:51 user nova-compute[71283]: DEBUG nova.compute.manager [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Start spawning the instance on the hypervisor. {{(pid=71283) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 10:28:51 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Creating instance directory {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 10:28:51 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Creating image(s) Apr 20 10:28:51 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Acquiring lock "/opt/stack/data/nova/instances/bf7d300f-b748-446b-97ce-6cae12609e7a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:51 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Lock "/opt/stack/data/nova/instances/bf7d300f-b748-446b-97ce-6cae12609e7a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:51 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Lock "/opt/stack/data/nova/instances/bf7d300f-b748-446b-97ce-6cae12609e7a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:51 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Acquiring lock "e372bccc070cc79e355c735e75048a4a0cd2e5f6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:51 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Lock "e372bccc070cc79e355c735e75048a4a0cd2e5f6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.003s {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:51 user nova-compute[71283]: DEBUG nova.policy [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6e5685756fe74751ad4c4fe85c30c951', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cb0f8979fdc4456db83cc88103856510', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71283) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 10:28:52 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/e372bccc070cc79e355c735e75048a4a0cd2e5f6.part --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:52 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/e372bccc070cc79e355c735e75048a4a0cd2e5f6.part --force-share --output=json" returned: 0 in 0.174s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 
20 10:28:52 user nova-compute[71283]: DEBUG nova.virt.images [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] 9e8cd77e-f94e-4043-a953-9fb6fa201c58 was qcow2, converting to raw {{(pid=71283) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 20 10:28:52 user nova-compute[71283]: DEBUG nova.privsep.utils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71283) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 20 10:28:52 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/e372bccc070cc79e355c735e75048a4a0cd2e5f6.part /opt/stack/data/nova/instances/_base/e372bccc070cc79e355c735e75048a4a0cd2e5f6.converted {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:52 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/e372bccc070cc79e355c735e75048a4a0cd2e5f6.part /opt/stack/data/nova/instances/_base/e372bccc070cc79e355c735e75048a4a0cd2e5f6.converted" returned: 0 in 0.166s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:52 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 
tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/e372bccc070cc79e355c735e75048a4a0cd2e5f6.converted --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:52 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/e372bccc070cc79e355c735e75048a4a0cd2e5f6.converted --force-share --output=json" returned: 0 in 0.190s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:52 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Lock "e372bccc070cc79e355c735e75048a4a0cd2e5f6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.032s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:52 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/e372bccc070cc79e355c735e75048a4a0cd2e5f6 --force-share --output=json {{(pid=71283) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:52 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/e372bccc070cc79e355c735e75048a4a0cd2e5f6 --force-share --output=json" returned: 0 in 0.150s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:52 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Acquiring lock "e372bccc070cc79e355c735e75048a4a0cd2e5f6" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:52 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Lock "e372bccc070cc79e355c735e75048a4a0cd2e5f6" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.005s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:52 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/e372bccc070cc79e355c735e75048a4a0cd2e5f6 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:52 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/e372bccc070cc79e355c735e75048a4a0cd2e5f6 --force-share --output=json" returned: 0 in 0.159s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:52 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/e372bccc070cc79e355c735e75048a4a0cd2e5f6,backing_fmt=raw /opt/stack/data/nova/instances/bf7d300f-b748-446b-97ce-6cae12609e7a/disk 1073741824 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:53 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/e372bccc070cc79e355c735e75048a4a0cd2e5f6,backing_fmt=raw /opt/stack/data/nova/instances/bf7d300f-b748-446b-97ce-6cae12609e7a/disk 1073741824" returned: 0 in 0.072s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:53 user 
nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Lock "e372bccc070cc79e355c735e75048a4a0cd2e5f6" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.241s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:53 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/e372bccc070cc79e355c735e75048a4a0cd2e5f6 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:53 user nova-compute[71283]: DEBUG nova.network.neutron [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Successfully created port: ffe07547-7e1b-4f9e-afa9-31ccb5dca61d {{(pid=71283) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 10:28:53 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/e372bccc070cc79e355c735e75048a4a0cd2e5f6 --force-share --output=json" returned: 0 in 0.149s {{(pid=71283) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:53 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Checking if we can resize image /opt/stack/data/nova/instances/bf7d300f-b748-446b-97ce-6cae12609e7a/disk. size=1073741824 {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 10:28:53 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/bf7d300f-b748-446b-97ce-6cae12609e7a/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:53 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/bf7d300f-b748-446b-97ce-6cae12609e7a/disk --force-share --output=json" returned: 0 in 0.168s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:53 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Cannot resize image /opt/stack/data/nova/instances/bf7d300f-b748-446b-97ce-6cae12609e7a/disk to a smaller size. 
{{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 10:28:53 user nova-compute[71283]: DEBUG nova.objects.instance [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Lazy-loading 'migration_context' on Instance uuid bf7d300f-b748-446b-97ce-6cae12609e7a {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:28:53 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Created local disks {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 10:28:53 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Ensure instance console log exists: /opt/stack/data/nova/instances/bf7d300f-b748-446b-97ce-6cae12609e7a/console.log {{(pid=71283) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 10:28:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:54 user nova-compute[71283]: DEBUG nova.network.neutron [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Successfully updated port: ffe07547-7e1b-4f9e-afa9-31ccb5dca61d {{(pid=71283) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 10:28:54 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Acquiring lock "refresh_cache-bf7d300f-b748-446b-97ce-6cae12609e7a" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:28:54 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Acquired lock "refresh_cache-bf7d300f-b748-446b-97ce-6cae12609e7a" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:28:54 user 
nova-compute[71283]: DEBUG nova.network.neutron [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Building network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 10:28:54 user nova-compute[71283]: DEBUG nova.compute.manager [req-48f93e6e-8a50-4800-afbb-cd09737d823b req-b796dd4f-c5cf-40b3-bb6f-7ec65c5ad735 service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Received event network-changed-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:28:54 user nova-compute[71283]: DEBUG nova.compute.manager [req-48f93e6e-8a50-4800-afbb-cd09737d823b req-b796dd4f-c5cf-40b3-bb6f-7ec65c5ad735 service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Refreshing instance network info cache due to event network-changed-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d. {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:28:54 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-48f93e6e-8a50-4800-afbb-cd09737d823b req-b796dd4f-c5cf-40b3-bb6f-7ec65c5ad735 service nova] Acquiring lock "refresh_cache-bf7d300f-b748-446b-97ce-6cae12609e7a" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:28:54 user nova-compute[71283]: DEBUG nova.network.neutron [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Instance cache missing network info. 
{{(pid=71283) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 10:28:54 user nova-compute[71283]: DEBUG nova.network.neutron [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Updating instance_info_cache with network_info: [{"id": "ffe07547-7e1b-4f9e-afa9-31ccb5dca61d", "address": "fa:16:3e:5d:9f:5d", "network": {"id": "5a79f1a7-4d58-42fc-9c84-6f9951485092", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1428889344-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0f8979fdc4456db83cc88103856510", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapffe07547-7e", "ovs_interfaceid": "ffe07547-7e1b-4f9e-afa9-31ccb5dca61d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:28:54 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Releasing lock "refresh_cache-bf7d300f-b748-446b-97ce-6cae12609e7a" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:28:54 user nova-compute[71283]: DEBUG nova.compute.manager [None 
req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Instance network_info: |[{"id": "ffe07547-7e1b-4f9e-afa9-31ccb5dca61d", "address": "fa:16:3e:5d:9f:5d", "network": {"id": "5a79f1a7-4d58-42fc-9c84-6f9951485092", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1428889344-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0f8979fdc4456db83cc88103856510", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapffe07547-7e", "ovs_interfaceid": "ffe07547-7e1b-4f9e-afa9-31ccb5dca61d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 10:28:54 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-48f93e6e-8a50-4800-afbb-cd09737d823b req-b796dd4f-c5cf-40b3-bb6f-7ec65c5ad735 service nova] Acquired lock "refresh_cache-bf7d300f-b748-446b-97ce-6cae12609e7a" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:28:54 user nova-compute[71283]: DEBUG nova.network.neutron [req-48f93e6e-8a50-4800-afbb-cd09737d823b req-b796dd4f-c5cf-40b3-bb6f-7ec65c5ad735 service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Refreshing network info cache for port ffe07547-7e1b-4f9e-afa9-31ccb5dca61d {{(pid=71283) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:28:54 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Start _get_guest_xml network_info=[{"id": "ffe07547-7e1b-4f9e-afa9-31ccb5dca61d", "address": "fa:16:3e:5d:9f:5d", "network": {"id": "5a79f1a7-4d58-42fc-9c84-6f9951485092", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1428889344-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0f8979fdc4456db83cc88103856510", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapffe07547-7e", "ovs_interfaceid": "ffe07547-7e1b-4f9e-afa9-31ccb5dca61d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'scsi', 'cdrom_bus': 'scsi', 'mapping': {'root': {'bus': 'scsi', 'dev': 'sda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'scsi', 'dev': 'sda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'scsi', 'dev': 'sdb', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:28:44Z,direct_url=,disk_format='qcow2',id=9e8cd77e-f94e-4043-a953-9fb6fa201c58,min_disk=0,min_ram=0,name='',owner='2bc21b2f5d93440aaf2b410ada266f85',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:28:46Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/sda', 'image': [{'size': 0, 'device_type': 'disk', 'device_name': '/dev/sda', 'guest_format': None, 'encryption_format': None, 'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'scsi', 'encrypted': False, 'image_id': '9e8cd77e-f94e-4043-a953-9fb6fa201c58'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 10:28:54 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:28:54 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:28:54 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71283) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 10:28:54 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T10:25:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:28:44Z,direct_url=,disk_format='qcow2',id=9e8cd77e-f94e-4043-a953-9fb6fa201c58,min_disk=0,min_ram=0,name='',owner='2bc21b2f5d93440aaf2b410ada266f85',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:28:46Z,virtual_size=,visibility=), allow threads: True {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 10:28:54 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Flavor limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 10:28:54 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Image 
limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 10:28:54 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Flavor pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 10:28:54 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Image pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 10:28:54 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 10:28:54 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 10:28:54 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 10:28:54 user nova-compute[71283]: DEBUG nova.virt.hardware [None 
req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Got 1 possible topologies {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 10:28:54 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 10:28:54 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 10:28:55 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2023-04-20T10:28:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-1435500858',display_name='tempest-AttachSCSIVolumeTestJSON-server-1435500858',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-1435500858',id=6,image_ref='9e8cd77e-f94e-4043-a953-9fb6fa201c58',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDbO3gWjR+kfpqealBy4YcF5JnglKkdA7HMnKPq7F8n83ZuToSao0nhpqddYS4a7HYIvtQt1kPTWohrL7u4lMFSRJRaAHMC0ISjFRgdgwi8OOridVoioARVP9cUAseOLWA==',key_name='tempest-keypair-1802426226',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cb0f8979fdc4456db83cc88103856510',ramdisk_id='',reservation_id='r-8knp4xne',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e8cd77e-f94e-4043-a953-9fb6fa201c58',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_machine_type='pc',image_hw_scsi_model='virtio-scsi',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachSCSIVolumeTestJSON-1542709130',owner_user_name='tempest-AttachSCSIVolumeTestJSON-1542709130-project-member'},tags=TagLis
t,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:28:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6e5685756fe74751ad4c4fe85c30c951',uuid=bf7d300f-b748-446b-97ce-6cae12609e7a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ffe07547-7e1b-4f9e-afa9-31ccb5dca61d", "address": "fa:16:3e:5d:9f:5d", "network": {"id": "5a79f1a7-4d58-42fc-9c84-6f9951485092", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1428889344-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0f8979fdc4456db83cc88103856510", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapffe07547-7e", "ovs_interfaceid": "ffe07547-7e1b-4f9e-afa9-31ccb5dca61d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71283) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 10:28:55 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Converting VIF {"id": "ffe07547-7e1b-4f9e-afa9-31ccb5dca61d", "address": "fa:16:3e:5d:9f:5d", "network": {"id": "5a79f1a7-4d58-42fc-9c84-6f9951485092", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1428889344-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": 
{"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0f8979fdc4456db83cc88103856510", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapffe07547-7e", "ovs_interfaceid": "ffe07547-7e1b-4f9e-afa9-31ccb5dca61d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:28:55 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:9f:5d,bridge_name='br-int',has_traffic_filtering=True,id=ffe07547-7e1b-4f9e-afa9-31ccb5dca61d,network=Network(5a79f1a7-4d58-42fc-9c84-6f9951485092),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffe07547-7e') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:28:55 user nova-compute[71283]: DEBUG nova.objects.instance [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Lazy-loading 'pci_devices' on Instance uuid bf7d300f-b748-446b-97ce-6cae12609e7a {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:28:55 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 
tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] End _get_guest_xml xml= [libvirt guest domain XML elided: the XML markup was stripped during log capture; recoverable values from the surviving text include uuid bf7d300f-b748-446b-97ce-6cae12609e7a, name instance-00000006, memory 131072, 1 vCPU, title tempest-AttachSCSIVolumeTestJSON-server-1435500858, creationTime 2023-04-20 10:28:54, owner tempest-AttachSCSIVolumeTestJSON-1542709130-project-member / tempest-AttachSCSIVolumeTestJSON-1542709130, sysinfo OpenStack Foundation / OpenStack Nova 0.0.0, os type hvm, CPU model Nehalem, RNG backend /dev/urandom] {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 10:28:55 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2023-04-20T10:28:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-1435500858',display_name='tempest-AttachSCSIVolumeTestJSON-server-1435500858',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-1435500858',id=6,image_ref='9e8cd77e-f94e-4043-a953-9fb6fa201c58',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDbO3gWjR+kfpqealBy4YcF5JnglKkdA7HMnKPq7F8n83ZuToSao0nhpqddYS4a7HYIvtQt1kPTWohrL7u4lMFSRJRaAHMC0ISjFRgdgwi8OOridVoioARVP9cUAseOLWA==',key_name='tempest-keypair-1802426226',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cb0f8979fdc4456db83cc88103856510',ramdisk_id='',reservation_id='r-8knp4xne',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e8cd77e-f94e-4043-a953-9fb6fa201c58',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_machine_type='pc',image_hw_scsi_model='virtio-scsi',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachSCSIVolumeTestJSON-1542709130',owner_user_name='tempest-AttachSCSIVolumeTestJSON-1542709130-project-member'
},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:28:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6e5685756fe74751ad4c4fe85c30c951',uuid=bf7d300f-b748-446b-97ce-6cae12609e7a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ffe07547-7e1b-4f9e-afa9-31ccb5dca61d", "address": "fa:16:3e:5d:9f:5d", "network": {"id": "5a79f1a7-4d58-42fc-9c84-6f9951485092", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1428889344-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0f8979fdc4456db83cc88103856510", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapffe07547-7e", "ovs_interfaceid": "ffe07547-7e1b-4f9e-afa9-31ccb5dca61d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 10:28:55 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Converting VIF {"id": "ffe07547-7e1b-4f9e-afa9-31ccb5dca61d", "address": "fa:16:3e:5d:9f:5d", "network": {"id": "5a79f1a7-4d58-42fc-9c84-6f9951485092", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1428889344-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": 
"10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0f8979fdc4456db83cc88103856510", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapffe07547-7e", "ovs_interfaceid": "ffe07547-7e1b-4f9e-afa9-31ccb5dca61d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:28:55 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:9f:5d,bridge_name='br-int',has_traffic_filtering=True,id=ffe07547-7e1b-4f9e-afa9-31ccb5dca61d,network=Network(5a79f1a7-4d58-42fc-9c84-6f9951485092),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffe07547-7e') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:28:55 user nova-compute[71283]: DEBUG os_vif [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:9f:5d,bridge_name='br-int',has_traffic_filtering=True,id=ffe07547-7e1b-4f9e-afa9-31ccb5dca61d,network=Network(5a79f1a7-4d58-42fc-9c84-6f9951485092),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffe07547-7e') {{(pid=71283) plug 
/usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 10:28:55 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:55 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:28:55 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 10:28:55 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:55 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapffe07547-7e, may_exist=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:28:55 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapffe07547-7e, col_values=(('external_ids', {'iface-id': 'ffe07547-7e1b-4f9e-afa9-31ccb5dca61d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5d:9f:5d', 'vm-uuid': 'bf7d300f-b748-446b-97ce-6cae12609e7a'}),)) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:28:55 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:55 user 
nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:28:55 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:55 user nova-compute[71283]: INFO os_vif [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:9f:5d,bridge_name='br-int',has_traffic_filtering=True,id=ffe07547-7e1b-4f9e-afa9-31ccb5dca61d,network=Network(5a79f1a7-4d58-42fc-9c84-6f9951485092),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffe07547-7e') Apr 20 10:28:55 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] No BDM found with device name sda, not building metadata. {{(pid=71283) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 10:28:55 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] No BDM found with device name sdb, not building metadata. 
{{(pid=71283) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 10:28:55 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] No VIF found with MAC fa:16:3e:5d:9f:5d, not building metadata {{(pid=71283) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 10:28:55 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Using config drive Apr 20 10:28:55 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Creating config drive at /opt/stack/data/nova/instances/bf7d300f-b748-446b-97ce-6cae12609e7a/disk.config Apr 20 10:28:55 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/bf7d300f-b748-446b-97ce-6cae12609e7a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 0.0.0 -quiet -J -r -V config-2 /tmp/tmpyo49nokq {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:28:55 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] CMD "genisoimage -o 
/opt/stack/data/nova/instances/bf7d300f-b748-446b-97ce-6cae12609e7a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 0.0.0 -quiet -J -r -V config-2 /tmp/tmpyo49nokq" returned: 0 in 0.051s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:28:55 user nova-compute[71283]: DEBUG nova.network.neutron [req-48f93e6e-8a50-4800-afbb-cd09737d823b req-b796dd4f-c5cf-40b3-bb6f-7ec65c5ad735 service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Updated VIF entry in instance network info cache for port ffe07547-7e1b-4f9e-afa9-31ccb5dca61d. {{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:28:55 user nova-compute[71283]: DEBUG nova.network.neutron [req-48f93e6e-8a50-4800-afbb-cd09737d823b req-b796dd4f-c5cf-40b3-bb6f-7ec65c5ad735 service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Updating instance_info_cache with network_info: [{"id": "ffe07547-7e1b-4f9e-afa9-31ccb5dca61d", "address": "fa:16:3e:5d:9f:5d", "network": {"id": "5a79f1a7-4d58-42fc-9c84-6f9951485092", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1428889344-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0f8979fdc4456db83cc88103856510", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapffe07547-7e", "ovs_interfaceid": "ffe07547-7e1b-4f9e-afa9-31ccb5dca61d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) 
update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:28:55 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-48f93e6e-8a50-4800-afbb-cd09737d823b req-b796dd4f-c5cf-40b3-bb6f-7ec65c5ad735 service nova] Releasing lock "refresh_cache-bf7d300f-b748-446b-97ce-6cae12609e7a" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:28:56 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:56 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:56 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:56 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:57 user nova-compute[71283]: DEBUG nova.compute.manager [req-b85258e9-2666-469d-8601-4aac511a8a5b req-ead47a37-a628-4435-8c03-66a576ea87df service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Received event network-vif-plugged-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:28:57 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-b85258e9-2666-469d-8601-4aac511a8a5b req-ead47a37-a628-4435-8c03-66a576ea87df service nova] Acquiring lock "bf7d300f-b748-446b-97ce-6cae12609e7a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:57 user nova-compute[71283]: DEBUG 
oslo_concurrency.lockutils [req-b85258e9-2666-469d-8601-4aac511a8a5b req-ead47a37-a628-4435-8c03-66a576ea87df service nova] Lock "bf7d300f-b748-446b-97ce-6cae12609e7a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:57 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-b85258e9-2666-469d-8601-4aac511a8a5b req-ead47a37-a628-4435-8c03-66a576ea87df service nova] Lock "bf7d300f-b748-446b-97ce-6cae12609e7a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:57 user nova-compute[71283]: DEBUG nova.compute.manager [req-b85258e9-2666-469d-8601-4aac511a8a5b req-ead47a37-a628-4435-8c03-66a576ea87df service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] No waiting events found dispatching network-vif-plugged-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:28:57 user nova-compute[71283]: WARNING nova.compute.manager [req-b85258e9-2666-469d-8601-4aac511a8a5b req-ead47a37-a628-4435-8c03-66a576ea87df service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Received unexpected event network-vif-plugged-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d for instance with vm_state building and task_state spawning. 
Apr 20 10:28:57 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:57 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:57 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:57 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:57 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:59 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Resumed> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:28:59 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] VM Resumed (Lifecycle Event) Apr 20 10:28:59 user nova-compute[71283]: DEBUG nova.compute.manager [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Instance event wait completed in 0 seconds for {{(pid=71283) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 10:28:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: 
bf7d300f-b748-446b-97ce-6cae12609e7a] Guest created on hypervisor {{(pid=71283) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 10:28:59 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:28:59 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Instance spawned successfully. Apr 20 10:28:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Attempting to register defaults for the following image properties: ['hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 10:28:59 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:28:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Found default for hw_input_bus of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None 
req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Found default for hw_pointer_model of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Found default for hw_video_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Found default for hw_vif_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:28:59 user nova-compute[71283]: DEBUG nova.compute.manager [req-6e1ef1c2-2246-46e1-88b6-c0b7bad6100b req-40bc311d-c9af-4cea-ad07-572ccd578cbd service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Received event network-vif-plugged-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:28:59 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-6e1ef1c2-2246-46e1-88b6-c0b7bad6100b req-40bc311d-c9af-4cea-ad07-572ccd578cbd service nova] Acquiring lock "bf7d300f-b748-446b-97ce-6cae12609e7a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:28:59 user nova-compute[71283]: DEBUG 
oslo_concurrency.lockutils [req-6e1ef1c2-2246-46e1-88b6-c0b7bad6100b req-40bc311d-c9af-4cea-ad07-572ccd578cbd service nova] Lock "bf7d300f-b748-446b-97ce-6cae12609e7a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:28:59 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-6e1ef1c2-2246-46e1-88b6-c0b7bad6100b req-40bc311d-c9af-4cea-ad07-572ccd578cbd service nova] Lock "bf7d300f-b748-446b-97ce-6cae12609e7a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:59 user nova-compute[71283]: DEBUG nova.compute.manager [req-6e1ef1c2-2246-46e1-88b6-c0b7bad6100b req-40bc311d-c9af-4cea-ad07-572ccd578cbd service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] No waiting events found dispatching network-vif-plugged-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:28:59 user nova-compute[71283]: WARNING nova.compute.manager [req-6e1ef1c2-2246-46e1-88b6-c0b7bad6100b req-40bc311d-c9af-4cea-ad07-572ccd578cbd service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Received unexpected event network-vif-plugged-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d for instance with vm_state building and task_state spawning. Apr 20 10:28:59 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 20 10:28:59 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Started> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:28:59 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] VM Started (Lifecycle Event) Apr 20 10:28:59 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:28:59 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:28:59 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 10:28:59 user nova-compute[71283]: INFO nova.compute.manager [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Took 7.63 seconds to spawn the instance on the hypervisor. 
Apr 20 10:28:59 user nova-compute[71283]: DEBUG nova.compute.manager [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:28:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:28:59 user nova-compute[71283]: INFO nova.compute.manager [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Took 8.49 seconds to build instance. Apr 20 10:28:59 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-82ea6831-a543-4805-b9d6-1e593bd502c6 tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Lock "bf7d300f-b748-446b-97ce-6cae12609e7a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.625s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:28:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:00 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on 
fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:05 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:06 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:08 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:29:08 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Cleaning up deleted instances {{(pid=71283) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 20 10:29:08 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] There are 0 instances to clean {{(pid=71283) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 20 10:29:08 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:29:08 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Cleaning up deleted instances with incomplete migration {{(pid=71283) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 20 10:29:08 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task 
ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:29:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:09 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:29:10 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:10 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:29:10 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:29:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Acquiring lock "23940921-698b-4e96-8aed-2d1c8c12299f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:29:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None 
req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Lock "23940921-698b-4e96-8aed-2d1c8c12299f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:29:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:29:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:29:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:29:10 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:29:10 user nova-compute[71283]: DEBUG nova.compute.manager [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 
23940921-698b-4e96-8aed-2d1c8c12299f] Starting instance... {{(pid=71283) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 10:29:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:29:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:29:10 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71283) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 10:29:10 user nova-compute[71283]: INFO nova.compute.claims [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Claim successful on node user Apr 20 10:29:10 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 
--cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9189f862-2e91-4420-ab64-54375c4f9466/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None 
req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.443s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG nova.compute.manager [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Start building networks asynchronously for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9189f862-2e91-4420-ab64-54375c4f9466/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9189f862-2e91-4420-ab64-54375c4f9466/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG nova.compute.manager [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Allocating IP information in the background. 
{{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG nova.network.neutron [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] allocate_for_instance() {{(pid=71283) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 10:29:11 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 10:29:11 user nova-compute[71283]: DEBUG nova.compute.manager [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Start building block device mappings for instance. 
{{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9189f862-2e91-4420-ab64-54375c4f9466/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/bf7d300f-b748-446b-97ce-6cae12609e7a/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG nova.compute.manager [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Start spawning the instance on the hypervisor. 
{{(pid=71283) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Creating instance directory {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 10:29:11 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Creating image(s) Apr 20 10:29:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Acquiring lock "/opt/stack/data/nova/instances/23940921-698b-4e96-8aed-2d1c8c12299f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Lock "/opt/stack/data/nova/instances/23940921-698b-4e96-8aed-2d1c8c12299f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Lock 
"/opt/stack/data/nova/instances/23940921-698b-4e96-8aed-2d1c8c12299f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG nova.policy [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8355ed1609024bc1a30cbc422d8f90c2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3d65a1127ac04ac8976d9fbb08197248', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71283) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/bf7d300f-b748-446b-97ce-6cae12609e7a/disk --force-share --output=json" returned: 0 in 0.153s {{(pid=71283) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/bf7d300f-b748-446b-97ce-6cae12609e7a/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.132s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Acquiring lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/bf7d300f-b748-446b-97ce-6cae12609e7a/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf494c03-d188-49c7-879e-29d2fa555549/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 
--force-share --output=json" returned: 0 in 0.132s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/23940921-698b-4e96-8aed-2d1c8c12299f/disk 1073741824 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf494c03-d188-49c7-879e-29d2fa555549/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:29:11 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf494c03-d188-49c7-879e-29d2fa555549/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:29:12 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o 
backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/23940921-698b-4e96-8aed-2d1c8c12299f/disk 1073741824" returned: 0 in 0.077s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:29:12 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.217s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:29:12 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:29:12 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf494c03-d188-49c7-879e-29d2fa555549/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:29:12 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f48e6aa1-dd33-42a4-89c9-20691b628c70/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:29:12 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.148s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:29:12 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Checking if we can resize image /opt/stack/data/nova/instances/23940921-698b-4e96-8aed-2d1c8c12299f/disk. 
size=1073741824 {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 10:29:12 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/23940921-698b-4e96-8aed-2d1c8c12299f/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:29:12 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f48e6aa1-dd33-42a4-89c9-20691b628c70/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:29:12 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f48e6aa1-dd33-42a4-89c9-20691b628c70/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:29:12 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/23940921-698b-4e96-8aed-2d1c8c12299f/disk 
--force-share --output=json" returned: 0 in 0.141s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:29:12 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Cannot resize image /opt/stack/data/nova/instances/23940921-698b-4e96-8aed-2d1c8c12299f/disk to a smaller size. {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 10:29:12 user nova-compute[71283]: DEBUG nova.objects.instance [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Lazy-loading 'migration_context' on Instance uuid 23940921-698b-4e96-8aed-2d1c8c12299f {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:29:12 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Created local disks {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 10:29:12 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Ensure instance console log exists: /opt/stack/data/nova/instances/23940921-698b-4e96-8aed-2d1c8c12299f/console.log {{(pid=71283) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 10:29:12 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Acquiring lock "vgpu_resources" by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:29:12 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:29:12 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:29:12 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f48e6aa1-dd33-42a4-89c9-20691b628c70/disk --force-share --output=json" returned: 0 in 0.163s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:29:12 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0f7a669c-28aa-42d2-8991-5852336c0f42/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:29:12 user nova-compute[71283]: DEBUG 
nova.network.neutron [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Successfully created port: 9c8c1513-213f-430e-8807-4172515e6170 {{(pid=71283) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 10:29:12 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0f7a669c-28aa-42d2-8991-5852336c0f42/disk --force-share --output=json" returned: 0 in 0.153s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:29:12 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0f7a669c-28aa-42d2-8991-5852336c0f42/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:29:12 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0f7a669c-28aa-42d2-8991-5852336c0f42/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:29:13 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 20 10:29:13 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:29:13 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=8216MB free_disk=26.549392700195312GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", 
"product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": 
"07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": 
"type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 10:29:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:29:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:29:13 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance cf494c03-d188-49c7-879e-29d2fa555549 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:29:13 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance f48e6aa1-dd33-42a4-89c9-20691b628c70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:29:13 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 0f7a669c-28aa-42d2-8991-5852336c0f42 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:29:13 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 64fb0d55-ef35-4386-86fe-00775b83a8d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:29:13 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 9189f862-2e91-4420-ab64-54375c4f9466 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:29:13 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance bf7d300f-b748-446b-97ce-6cae12609e7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:29:13 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 23940921-698b-4e96-8aed-2d1c8c12299f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:29:13 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 7 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:29:13 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=1408MB phys_disk=40GB used_disk=7GB total_vcpus=12 used_vcpus=7 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:29:13 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:29:13 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 
'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:29:13 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:29:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.353s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:29:13 user nova-compute[71283]: DEBUG nova.network.neutron [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Successfully updated port: 9c8c1513-213f-430e-8807-4172515e6170 {{(pid=71283) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 10:29:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Acquiring lock "refresh_cache-23940921-698b-4e96-8aed-2d1c8c12299f" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:29:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Acquired lock "refresh_cache-23940921-698b-4e96-8aed-2d1c8c12299f" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:29:13 user nova-compute[71283]: DEBUG 
nova.network.neutron [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Building network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 10:29:13 user nova-compute[71283]: DEBUG nova.compute.manager [req-4544b413-ead5-4531-bb98-6692c2594e85 req-74faeb87-3137-4aef-8818-6daf7908bb5f service nova] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Received event network-changed-9c8c1513-213f-430e-8807-4172515e6170 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:29:13 user nova-compute[71283]: DEBUG nova.compute.manager [req-4544b413-ead5-4531-bb98-6692c2594e85 req-74faeb87-3137-4aef-8818-6daf7908bb5f service nova] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Refreshing instance network info cache due to event network-changed-9c8c1513-213f-430e-8807-4172515e6170. {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:29:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-4544b413-ead5-4531-bb98-6692c2594e85 req-74faeb87-3137-4aef-8818-6daf7908bb5f service nova] Acquiring lock "refresh_cache-23940921-698b-4e96-8aed-2d1c8c12299f" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:29:13 user nova-compute[71283]: DEBUG nova.network.neutron [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Instance cache missing network info. 
{{(pid=71283) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.network.neutron [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Updating instance_info_cache with network_info: [{"id": "9c8c1513-213f-430e-8807-4172515e6170", "address": "fa:16:3e:82:4d:f7", "network": {"id": "324b6a85-3443-483c-bd8e-4a128c9daf02", "bridge": "br-int", "label": "tempest-VolumesActionsTest-627525662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "3d65a1127ac04ac8976d9fbb08197248", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c8c1513-21", "ovs_interfaceid": "9c8c1513-213f-430e-8807-4172515e6170", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Releasing lock "refresh_cache-23940921-698b-4e96-8aed-2d1c8c12299f" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.compute.manager [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 
tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Instance network_info: |[{"id": "9c8c1513-213f-430e-8807-4172515e6170", "address": "fa:16:3e:82:4d:f7", "network": {"id": "324b6a85-3443-483c-bd8e-4a128c9daf02", "bridge": "br-int", "label": "tempest-VolumesActionsTest-627525662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "3d65a1127ac04ac8976d9fbb08197248", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c8c1513-21", "ovs_interfaceid": "9c8c1513-213f-430e-8807-4172515e6170", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-4544b413-ead5-4531-bb98-6692c2594e85 req-74faeb87-3137-4aef-8818-6daf7908bb5f service nova] Acquired lock "refresh_cache-23940921-698b-4e96-8aed-2d1c8c12299f" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.network.neutron [req-4544b413-ead5-4531-bb98-6692c2594e85 req-74faeb87-3137-4aef-8818-6daf7908bb5f service nova] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Refreshing network info cache for port 9c8c1513-213f-430e-8807-4172515e6170 {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None 
req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Start _get_guest_xml network_info=[{"id": "9c8c1513-213f-430e-8807-4172515e6170", "address": "fa:16:3e:82:4d:f7", "network": {"id": "324b6a85-3443-483c-bd8e-4a128c9daf02", "bridge": "br-int", "label": "tempest-VolumesActionsTest-627525662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "3d65a1127ac04ac8976d9fbb08197248", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c8c1513-21", "ovs_interfaceid": "9c8c1513-213f-430e-8807-4172515e6170", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': 
None, 'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '3687b278-c2bc-46f2-9b7e-579c8d06fe41'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 10:29:14 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:29:14 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71283) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T10:25:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=), allow threads: True {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Flavor limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Image limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Flavor pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Image pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Chose sockets=0, cores=0, threads=0; 
limits were sockets=65536, cores=65536, threads=65536 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Got 1 possible topologies {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG 
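Aside: the nova.virt.hardware records above walk a topology search: enumerate (sockets, cores, threads) triples whose product equals the vCPU count, bounded by the logged 65536-per-dimension limits; with one vCPU and no flavor or image preferences, only 1:1:1 survives. A rough approximation (not the actual Nova implementation):

```python
# Rough approximation of the topology enumeration seen in the log:
# all (sockets, cores, threads) with sockets * cores * threads == vcpus,
# each dimension capped by the logged limits (65536 by default).

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    topos = []
    for s in range(1, min(vcpus, max_sockets) + 1):
        if vcpus % s:
            continue
        for c in range(1, min(vcpus // s, max_cores) + 1):
            if (vcpus // s) % c:
                continue
            t = vcpus // (s * c)
            if t <= max_threads:
                topos.append((s, c, t))
    return topos

print(possible_topologies(1))  # [(1, 1, 1)] -- "Got 1 possible topologies"
```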
nova.virt.libvirt.vif [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:29:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-304888933',display_name='tempest-VolumesActionsTest-instance-304888933',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-304888933',id=7,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3d65a1127ac04ac8976d9fbb08197248',ramdisk_id='',reservation_id='r-o86xj3x6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-640305762',owner_user_name='tempest-VolumesActionsTest-640305762-project-member'},t
ags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:29:12Z,user_data=None,user_id='8355ed1609024bc1a30cbc422d8f90c2',uuid=23940921-698b-4e96-8aed-2d1c8c12299f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9c8c1513-213f-430e-8807-4172515e6170", "address": "fa:16:3e:82:4d:f7", "network": {"id": "324b6a85-3443-483c-bd8e-4a128c9daf02", "bridge": "br-int", "label": "tempest-VolumesActionsTest-627525662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "3d65a1127ac04ac8976d9fbb08197248", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c8c1513-21", "ovs_interfaceid": "9c8c1513-213f-430e-8807-4172515e6170", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71283) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Converting VIF {"id": "9c8c1513-213f-430e-8807-4172515e6170", "address": "fa:16:3e:82:4d:f7", "network": {"id": "324b6a85-3443-483c-bd8e-4a128c9daf02", "bridge": "br-int", "label": "tempest-VolumesActionsTest-627525662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "3d65a1127ac04ac8976d9fbb08197248", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c8c1513-21", "ovs_interfaceid": "9c8c1513-213f-430e-8807-4172515e6170", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:4d:f7,bridge_name='br-int',has_traffic_filtering=True,id=9c8c1513-213f-430e-8807-4172515e6170,network=Network(324b6a85-3443-483c-bd8e-4a128c9daf02),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c8c1513-21') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.objects.instance [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Lazy-loading 'pci_devices' on Instance uuid 23940921-698b-4e96-8aed-2d1c8c12299f {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] End _get_guest_xml xml= Apr 20 10:29:14 user nova-compute[71283]: 23940921-698b-4e96-8aed-2d1c8c12299f Apr 20 10:29:14 user nova-compute[71283]: 
[libvirt guest XML logged here; XML markup was stripped in capture. Recoverable fields: name instance-00000007, uuid 23940921-698b-4e96-8aed-2d1c8c12299f, memory 131072 KiB, 1 vCPU, machine type hvm, CPU model Nehalem, RNG backend /dev/urandom, sysinfo OpenStack Foundation / OpenStack Nova 0.0.0, owner tempest-VolumesActionsTest-640305762 / tempest-VolumesActionsTest-640305762-project-member] {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None
req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:29:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-304888933',display_name='tempest-VolumesActionsTest-instance-304888933',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-304888933',id=7,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3d65a1127ac04ac8976d9fbb08197248',ramdisk_id='',reservation_id='r-o86xj3x6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-640305762',owner_user_name='tempest-VolumesActionsTest-640305762-project-member'},tags=TagList,tas
k_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:29:12Z,user_data=None,user_id='8355ed1609024bc1a30cbc422d8f90c2',uuid=23940921-698b-4e96-8aed-2d1c8c12299f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9c8c1513-213f-430e-8807-4172515e6170", "address": "fa:16:3e:82:4d:f7", "network": {"id": "324b6a85-3443-483c-bd8e-4a128c9daf02", "bridge": "br-int", "label": "tempest-VolumesActionsTest-627525662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "3d65a1127ac04ac8976d9fbb08197248", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c8c1513-21", "ovs_interfaceid": "9c8c1513-213f-430e-8807-4172515e6170", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Converting VIF {"id": "9c8c1513-213f-430e-8807-4172515e6170", "address": "fa:16:3e:82:4d:f7", "network": {"id": "324b6a85-3443-483c-bd8e-4a128c9daf02", "bridge": "br-int", "label": "tempest-VolumesActionsTest-627525662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{}}], "meta": {"injected": false, "tenant_id": "3d65a1127ac04ac8976d9fbb08197248", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c8c1513-21", "ovs_interfaceid": "9c8c1513-213f-430e-8807-4172515e6170", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:4d:f7,bridge_name='br-int',has_traffic_filtering=True,id=9c8c1513-213f-430e-8807-4172515e6170,network=Network(324b6a85-3443-483c-bd8e-4a128c9daf02),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c8c1513-21') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG os_vif [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:4d:f7,bridge_name='br-int',has_traffic_filtering=True,id=9c8c1513-213f-430e-8807-4172515e6170,network=Network(324b6a85-3443-483c-bd8e-4a128c9daf02),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c8c1513-21') {{(pid=71283) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:14 user 
nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9c8c1513-21, may_exist=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9c8c1513-21, col_values=(('external_ids', {'iface-id': '9c8c1513-213f-430e-8807-4172515e6170', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:82:4d:f7', 'vm-uuid': '23940921-698b-4e96-8aed-2d1c8c12299f'}),)) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:14 user nova-compute[71283]: INFO os_vif [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:4d:f7,bridge_name='br-int',has_traffic_filtering=True,id=9c8c1513-213f-430e-8807-4172515e6170,network=Network(324b6a85-3443-483c-bd8e-4a128c9daf02),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c8c1513-21') Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71283) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] No VIF found with MAC fa:16:3e:82:4d:f7, not building metadata {{(pid=71283) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Rebuilding the list of instances to heal {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.network.neutron [req-4544b413-ead5-4531-bb98-6692c2594e85 req-74faeb87-3137-4aef-8818-6daf7908bb5f service nova] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Updated VIF entry in instance network info cache for port 9c8c1513-213f-430e-8807-4172515e6170. 
{{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.network.neutron [req-4544b413-ead5-4531-bb98-6692c2594e85 req-74faeb87-3137-4aef-8818-6daf7908bb5f service nova] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Updating instance_info_cache with network_info: [{"id": "9c8c1513-213f-430e-8807-4172515e6170", "address": "fa:16:3e:82:4d:f7", "network": {"id": "324b6a85-3443-483c-bd8e-4a128c9daf02", "bridge": "br-int", "label": "tempest-VolumesActionsTest-627525662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "3d65a1127ac04ac8976d9fbb08197248", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c8c1513-21", "ovs_interfaceid": "9c8c1513-213f-430e-8807-4172515e6170", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Skipping network cache update for instance because it is Building. 
{{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9805}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-4544b413-ead5-4531-bb98-6692c2594e85 req-74faeb87-3137-4aef-8818-6daf7908bb5f service nova] Releasing lock "refresh_cache-23940921-698b-4e96-8aed-2d1c8c12299f" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "refresh_cache-cf494c03-d188-49c7-879e-29d2fa555549" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquired lock "refresh_cache-cf494c03-d188-49c7-879e-29d2fa555549" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Forcefully refreshing network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 10:29:14 user nova-compute[71283]: DEBUG nova.objects.instance [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lazy-loading 'info_cache' on Instance uuid cf494c03-d188-49c7-879e-29d2fa555549 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:29:15 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Updating instance_info_cache with network_info: [{"id": "eafcafd8-d33e-48b1-9947-cbce2d458180", "address": "fa:16:3e:de:6a:d4", "network": {"id": "bd16afd3-655a-4681-9793-21eff7495aee", "bridge": "br-int", 
"label": "tempest-DeleteServersTestJSON-1742457713-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f986df042c594f71a4db3da582def690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapeafcafd8-d3", "ovs_interfaceid": "eafcafd8-d33e-48b1-9947-cbce2d458180", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:29:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Releasing lock "refresh_cache-cf494c03-d188-49c7-879e-29d2fa555549" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:29:15 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Updated the network info_cache for instance {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 10:29:15 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:29:15 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task 
ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:29:15 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:29:15 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:29:15 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:29:15 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:29:15 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:15 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:15 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:15 user nova-compute[71283]: DEBUG nova.compute.manager [req-410b5371-47dc-4bdf-8c88-d7a3773b54bf req-4710d69b-8990-43b2-8937-44325338b31a service nova] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Received event network-vif-plugged-9c8c1513-213f-430e-8807-4172515e6170 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:29:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-410b5371-47dc-4bdf-8c88-d7a3773b54bf req-4710d69b-8990-43b2-8937-44325338b31a service nova] Acquiring lock "23940921-698b-4e96-8aed-2d1c8c12299f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:29:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-410b5371-47dc-4bdf-8c88-d7a3773b54bf req-4710d69b-8990-43b2-8937-44325338b31a service nova] Lock "23940921-698b-4e96-8aed-2d1c8c12299f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:29:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-410b5371-47dc-4bdf-8c88-d7a3773b54bf req-4710d69b-8990-43b2-8937-44325338b31a 
service nova] Lock "23940921-698b-4e96-8aed-2d1c8c12299f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:29:15 user nova-compute[71283]: DEBUG nova.compute.manager [req-410b5371-47dc-4bdf-8c88-d7a3773b54bf req-4710d69b-8990-43b2-8937-44325338b31a service nova] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] No waiting events found dispatching network-vif-plugged-9c8c1513-213f-430e-8807-4172515e6170 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:29:15 user nova-compute[71283]: WARNING nova.compute.manager [req-410b5371-47dc-4bdf-8c88-d7a3773b54bf req-4710d69b-8990-43b2-8937-44325338b31a service nova] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Received unexpected event network-vif-plugged-9c8c1513-213f-430e-8807-4172515e6170 for instance with vm_state building and task_state spawning. 
Apr 20 10:29:17 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:17 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Resumed> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:29:17 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] VM Resumed (Lifecycle Event) Apr 20 10:29:17 user nova-compute[71283]: DEBUG nova.compute.manager [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Instance event wait completed in 0 seconds for {{(pid=71283) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 10:29:17 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Guest created on hypervisor {{(pid=71283) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 10:29:17 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Instance spawned successfully. 
Apr 20 10:29:17 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 10:29:17 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:29:17 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:29:17 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Found default for hw_cdrom_bus of ide {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:29:17 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Found default for hw_disk_bus of virtio {{(pid=71283) _register_undefined_instance_details 
/opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:29:17 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Found default for hw_input_bus of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:29:17 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Found default for hw_pointer_model of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:29:17 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Found default for hw_video_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:29:17 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Found default for hw_vif_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:29:17 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 20 10:29:17 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Started> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:29:17 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] VM Started (Lifecycle Event) Apr 20 10:29:17 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:29:17 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:29:17 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 10:29:17 user nova-compute[71283]: INFO nova.compute.manager [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Took 6.03 seconds to spawn the instance on the hypervisor. 
Apr 20 10:29:17 user nova-compute[71283]: DEBUG nova.compute.manager [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:29:17 user nova-compute[71283]: INFO nova.compute.manager [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Took 6.88 seconds to build instance. Apr 20 10:29:17 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d4197410-cb17-4a9a-adaa-d7a2d2de32ec tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Lock "23940921-698b-4e96-8aed-2d1c8c12299f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.016s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:29:17 user nova-compute[71283]: DEBUG nova.compute.manager [req-785f0761-41e6-40f1-bccb-a7524e919294 req-1ae5b5ce-f721-4921-990f-a23ed7f8c90b service nova] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Received event network-vif-plugged-9c8c1513-213f-430e-8807-4172515e6170 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:29:17 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-785f0761-41e6-40f1-bccb-a7524e919294 req-1ae5b5ce-f721-4921-990f-a23ed7f8c90b service nova] Acquiring lock "23940921-698b-4e96-8aed-2d1c8c12299f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:29:17 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils 
[req-785f0761-41e6-40f1-bccb-a7524e919294 req-1ae5b5ce-f721-4921-990f-a23ed7f8c90b service nova] Lock "23940921-698b-4e96-8aed-2d1c8c12299f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:29:17 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-785f0761-41e6-40f1-bccb-a7524e919294 req-1ae5b5ce-f721-4921-990f-a23ed7f8c90b service nova] Lock "23940921-698b-4e96-8aed-2d1c8c12299f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:29:17 user nova-compute[71283]: DEBUG nova.compute.manager [req-785f0761-41e6-40f1-bccb-a7524e919294 req-1ae5b5ce-f721-4921-990f-a23ed7f8c90b service nova] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] No waiting events found dispatching network-vif-plugged-9c8c1513-213f-430e-8807-4172515e6170 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:29:17 user nova-compute[71283]: WARNING nova.compute.manager [req-785f0761-41e6-40f1-bccb-a7524e919294 req-1ae5b5ce-f721-4921-990f-a23ed7f8c90b service nova] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Received unexpected event network-vif-plugged-9c8c1513-213f-430e-8807-4172515e6170 for instance with vm_state active and task_state None. 
Apr 20 10:29:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:39 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:39 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:29:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:29:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:29:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:29:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:29:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:29:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 
20 10:29:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:30:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:30:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:30:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:30:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:30:11 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 20 10:30:11 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 20 10:30:12 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 20 10:30:12 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 20 10:30:12 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
Apr 20 10:30:12 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
Apr 20 10:30:12 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
Apr 20 10:30:12 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
Apr 20 10:30:12 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 20 10:30:13 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 20 10:30:13 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 20 10:30:13 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 20 10:30:13 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9189f862-2e91-4420-ab64-54375c4f9466/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 20 10:30:13 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9189f862-2e91-4420-ab64-54375c4f9466/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 20 10:30:13 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9189f862-2e91-4420-ab64-54375c4f9466/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 20 10:30:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-b2da604f-6cf6-4ab6-b5cf-9497d414fbb9 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Acquiring lock "cf494c03-d188-49c7-879e-29d2fa555549" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
Apr 20 10:30:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-b2da604f-6cf6-4ab6-b5cf-9497d414fbb9 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Lock "cf494c03-d188-49c7-879e-29d2fa555549" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.003s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
Apr 20 10:30:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-b2da604f-6cf6-4ab6-b5cf-9497d414fbb9 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Acquiring lock "cf494c03-d188-49c7-879e-29d2fa555549-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
Apr 20 10:30:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-b2da604f-6cf6-4ab6-b5cf-9497d414fbb9 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Lock "cf494c03-d188-49c7-879e-29d2fa555549-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.002s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
Apr 20 10:30:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-b2da604f-6cf6-4ab6-b5cf-9497d414fbb9 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Lock "cf494c03-d188-49c7-879e-29d2fa555549-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
Apr 20 10:30:13 user nova-compute[71283]: INFO nova.compute.manager [None req-b2da604f-6cf6-4ab6-b5cf-9497d414fbb9 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Terminating instance
Apr 20 10:30:13 user nova-compute[71283]: DEBUG nova.compute.manager [None req-b2da604f-6cf6-4ab6-b5cf-9497d414fbb9 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Start destroying the instance on the hypervisor. {{(pid=71283) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}}
Apr 20 10:30:13 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9189f862-2e91-4420-ab64-54375c4f9466/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 20 10:30:13 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/bf7d300f-b748-446b-97ce-6cae12609e7a/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 20 10:30:13 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:30:13 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:30:13 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:30:13 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/bf7d300f-b748-446b-97ce-6cae12609e7a/disk --force-share --output=json" returned: 0 in 0.153s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 20 10:30:13 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/bf7d300f-b748-446b-97ce-6cae12609e7a/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 20 10:30:13 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/bf7d300f-b748-446b-97ce-6cae12609e7a/disk --force-share --output=json" returned: 0 in 0.158s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 20 10:30:13 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/23940921-698b-4e96-8aed-2d1c8c12299f/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 20 10:30:13 user nova-compute[71283]: DEBUG nova.compute.manager [req-88ee509d-45d9-466e-83b8-e369a5c19878 req-8990f45c-259d-496d-bc53-1663d617070b service nova] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Received event network-vif-unplugged-eafcafd8-d33e-48b1-9947-cbce2d458180 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}}
Apr 20 10:30:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-88ee509d-45d9-466e-83b8-e369a5c19878 req-8990f45c-259d-496d-bc53-1663d617070b service nova] Acquiring lock "cf494c03-d188-49c7-879e-29d2fa555549-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
Apr 20 10:30:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-88ee509d-45d9-466e-83b8-e369a5c19878 req-8990f45c-259d-496d-bc53-1663d617070b service nova] Lock "cf494c03-d188-49c7-879e-29d2fa555549-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
Apr 20 10:30:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-88ee509d-45d9-466e-83b8-e369a5c19878 req-8990f45c-259d-496d-bc53-1663d617070b service nova] Lock "cf494c03-d188-49c7-879e-29d2fa555549-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
Apr 20 10:30:13 user nova-compute[71283]: DEBUG nova.compute.manager [req-88ee509d-45d9-466e-83b8-e369a5c19878 req-8990f45c-259d-496d-bc53-1663d617070b service nova] [instance: cf494c03-d188-49c7-879e-29d2fa555549] No waiting events found dispatching network-vif-unplugged-eafcafd8-d33e-48b1-9947-cbce2d458180 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
Apr 20 10:30:13 user nova-compute[71283]: DEBUG nova.compute.manager [req-88ee509d-45d9-466e-83b8-e369a5c19878 req-8990f45c-259d-496d-bc53-1663d617070b service nova] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Received event network-vif-unplugged-eafcafd8-d33e-48b1-9947-cbce2d458180 for instance with task_state deleting. {{(pid=71283) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}}
Apr 20 10:30:13 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/23940921-698b-4e96-8aed-2d1c8c12299f/disk --force-share --output=json" returned: 0 in 0.179s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 20 10:30:13 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/23940921-698b-4e96-8aed-2d1c8c12299f/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 20 10:30:14 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Instance destroyed successfully.
Apr 20 10:30:14 user nova-compute[71283]: DEBUG nova.objects.instance [None req-b2da604f-6cf6-4ab6-b5cf-9497d414fbb9 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Lazy-loading 'resources' on Instance uuid cf494c03-d188-49c7-879e-29d2fa555549 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}}
Apr 20 10:30:14 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-b2da604f-6cf6-4ab6-b5cf-9497d414fbb9 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:28:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-294793328',display_name='tempest-DeleteServersTestJSON-server-294793328',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-294793328',id=1,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2023-04-20T10:28:32Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f986df042c594f71a4db3da582def690',ramdisk_id='',reservation_id='r-qdvy05xy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-DeleteServersTestJSON-1091162656',owner_user_name='tempest-DeleteServersTestJSON-1091162656-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2023-04-20T10:28:33Z,user_data=None,user_id='25f64a72e9ec4ad599a5c63bec4d092e',uuid=cf494c03-d188-49c7-879e-29d2fa555549,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eafcafd8-d33e-48b1-9947-cbce2d458180", "address": "fa:16:3e:de:6a:d4", "network": {"id": "bd16afd3-655a-4681-9793-21eff7495aee", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1742457713-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f986df042c594f71a4db3da582def690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapeafcafd8-d3", "ovs_interfaceid": "eafcafd8-d33e-48b1-9947-cbce2d458180", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}}
Apr 20 10:30:14 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-b2da604f-6cf6-4ab6-b5cf-9497d414fbb9 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Converting VIF {"id": "eafcafd8-d33e-48b1-9947-cbce2d458180", "address": "fa:16:3e:de:6a:d4", "network": {"id": "bd16afd3-655a-4681-9793-21eff7495aee", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1742457713-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f986df042c594f71a4db3da582def690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapeafcafd8-d3", "ovs_interfaceid": "eafcafd8-d33e-48b1-9947-cbce2d458180", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}}
Apr 20 10:30:14 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-b2da604f-6cf6-4ab6-b5cf-9497d414fbb9 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:de:6a:d4,bridge_name='br-int',has_traffic_filtering=True,id=eafcafd8-d33e-48b1-9947-cbce2d458180,network=Network(bd16afd3-655a-4681-9793-21eff7495aee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeafcafd8-d3') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}}
Apr 20 10:30:14 user nova-compute[71283]: DEBUG os_vif [None req-b2da604f-6cf6-4ab6-b5cf-9497d414fbb9 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:de:6a:d4,bridge_name='br-int',has_traffic_filtering=True,id=eafcafd8-d33e-48b1-9947-cbce2d458180,network=Network(bd16afd3-655a-4681-9793-21eff7495aee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeafcafd8-d3') {{(pid=71283) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}}
Apr 20 10:30:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:30:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeafcafd8-d3, bridge=br-int, if_exists=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}}
Apr 20 10:30:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:30:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}}
Apr 20 10:30:14 user nova-compute[71283]: INFO os_vif [None req-b2da604f-6cf6-4ab6-b5cf-9497d414fbb9 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:de:6a:d4,bridge_name='br-int',has_traffic_filtering=True,id=eafcafd8-d33e-48b1-9947-cbce2d458180,network=Network(bd16afd3-655a-4681-9793-21eff7495aee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeafcafd8-d3')
Apr 20 10:30:14 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-b2da604f-6cf6-4ab6-b5cf-9497d414fbb9 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Deleting instance files /opt/stack/data/nova/instances/cf494c03-d188-49c7-879e-29d2fa555549_del
Apr 20 10:30:14 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-b2da604f-6cf6-4ab6-b5cf-9497d414fbb9 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Deletion of /opt/stack/data/nova/instances/cf494c03-d188-49c7-879e-29d2fa555549_del complete
Apr 20 10:30:14 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/23940921-698b-4e96-8aed-2d1c8c12299f/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 20 10:30:14 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Periodic task is updating the host stats, it is trying to get disk info for instance-00000001, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /opt/stack/data/nova/instances/cf494c03-d188-49c7-879e-29d2fa555549/disk: nova.exception.DiskNotFound: No disk at /opt/stack/data/nova/instances/cf494c03-d188-49c7-879e-29d2fa555549/disk
Apr 20 10:30:14 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f48e6aa1-dd33-42a4-89c9-20691b628c70/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 20 10:30:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:30:14 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-b2da604f-6cf6-4ab6-b5cf-9497d414fbb9 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Checking UEFI support for host arch (x86_64) {{(pid=71283) supports_uefi /opt/stack/nova/nova/virt/libvirt/host.py:1722}}
Apr 20 10:30:14 user nova-compute[71283]: INFO nova.virt.libvirt.host [None req-b2da604f-6cf6-4ab6-b5cf-9497d414fbb9 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] UEFI support detected
Apr 20 10:30:14 user nova-compute[71283]: INFO nova.compute.manager [None req-b2da604f-6cf6-4ab6-b5cf-9497d414fbb9 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Took 0.76 seconds to destroy the instance on the hypervisor.
Apr 20 10:30:14 user nova-compute[71283]: DEBUG oslo.service.loopingcall [None req-b2da604f-6cf6-4ab6-b5cf-9497d414fbb9 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=71283) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
Apr 20 10:30:14 user nova-compute[71283]: DEBUG nova.compute.manager [-] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Deallocating network for instance {{(pid=71283) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
Apr 20 10:30:14 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: cf494c03-d188-49c7-879e-29d2fa555549] deallocate_for_instance() {{(pid=71283) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}}
Apr 20 10:30:14 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f48e6aa1-dd33-42a4-89c9-20691b628c70/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 20 10:30:14 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f48e6aa1-dd33-42a4-89c9-20691b628c70/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 20 10:30:14 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f48e6aa1-dd33-42a4-89c9-20691b628c70/disk --force-share --output=json" returned: 0 in 0.192s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 20 10:30:14 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0f7a669c-28aa-42d2-8991-5852336c0f42/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 20 10:30:14 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0f7a669c-28aa-42d2-8991-5852336c0f42/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 20 10:30:14 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0f7a669c-28aa-42d2-8991-5852336c0f42/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 20 10:30:14 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0f7a669c-28aa-42d2-8991-5852336c0f42/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 20 10:30:15 user nova-compute[71283]: DEBUG nova.compute.manager [req-64694571-8896-4167-b87d-47df8c0198a2 req-2e8de904-45dd-431e-90ed-47faf79804aa service nova] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Received event network-vif-deleted-eafcafd8-d33e-48b1-9947-cbce2d458180 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}}
Apr 20 10:30:15 user nova-compute[71283]: INFO nova.compute.manager [req-64694571-8896-4167-b87d-47df8c0198a2 req-2e8de904-45dd-431e-90ed-47faf79804aa service nova] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Neutron deleted interface eafcafd8-d33e-48b1-9947-cbce2d458180; detaching it from the instance and deleting it from the info cache
Apr 20 10:30:15 user nova-compute[71283]: DEBUG nova.network.neutron [req-64694571-8896-4167-b87d-47df8c0198a2 req-2e8de904-45dd-431e-90ed-47faf79804aa service nova] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
Apr 20 10:30:15 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
Apr 20 10:30:15 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Apr 20 10:30:15 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Apr 20 10:30:15 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=8481MB free_disk=26.525917053222656GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address":
"0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 10:30:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:30:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:30:15 user nova-compute[71283]: DEBUG nova.compute.manager [req-64694571-8896-4167-b87d-47df8c0198a2 req-2e8de904-45dd-431e-90ed-47faf79804aa service nova] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Detach interface failed, port_id=eafcafd8-d33e-48b1-9947-cbce2d458180, reason: Instance cf494c03-d188-49c7-879e-29d2fa555549 could not be found. {{(pid=71283) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 20 10:30:15 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Took 0.99 seconds to deallocate network for instance. 
Apr 20 10:30:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-b2da604f-6cf6-4ab6-b5cf-9497d414fbb9 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:30:15 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance cf494c03-d188-49c7-879e-29d2fa555549 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:30:15 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance f48e6aa1-dd33-42a4-89c9-20691b628c70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:30:15 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 0f7a669c-28aa-42d2-8991-5852336c0f42 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:30:15 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 64fb0d55-ef35-4386-86fe-00775b83a8d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:30:15 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 9189f862-2e91-4420-ab64-54375c4f9466 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:30:15 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance bf7d300f-b748-446b-97ce-6cae12609e7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:30:15 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 23940921-698b-4e96-8aed-2d1c8c12299f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:30:15 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 7 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:30:15 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=1408MB phys_disk=40GB used_disk=7GB total_vcpus=12 used_vcpus=7 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:30:15 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Refreshing inventories for resource provider bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 20 10:30:15 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Updating ProviderTree inventory for provider bdbc83bd-9307-4e20-8e3d-430b77499399 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 20 10:30:15 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Updating inventory in ProviderTree for provider bdbc83bd-9307-4e20-8e3d-430b77499399 with 
inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 20 10:30:15 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Refreshing aggregate associations for resource provider bdbc83bd-9307-4e20-8e3d-430b77499399, aggregates: None {{(pid=71283) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 20 10:30:15 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Refreshing trait associations for resource provider bdbc83bd-9307-4e20-8e3d-430b77499399, traits: 
COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_EXTEND,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE {{(pid=71283) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 20 10:30:15 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:30:15 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 
'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:30:15 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:30:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.686s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:30:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-b2da604f-6cf6-4ab6-b5cf-9497d414fbb9 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.585s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:30:15 user nova-compute[71283]: DEBUG nova.compute.manager [req-d49e36f6-752e-4f12-b6c2-f75262a9fd9f req-671426e6-ce42-4ecd-975a-3a323fd464d6 service nova] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Received event network-vif-plugged-eafcafd8-d33e-48b1-9947-cbce2d458180 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:30:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-d49e36f6-752e-4f12-b6c2-f75262a9fd9f req-671426e6-ce42-4ecd-975a-3a323fd464d6 service nova] Acquiring lock "cf494c03-d188-49c7-879e-29d2fa555549-events" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:30:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-d49e36f6-752e-4f12-b6c2-f75262a9fd9f req-671426e6-ce42-4ecd-975a-3a323fd464d6 service nova] Lock "cf494c03-d188-49c7-879e-29d2fa555549-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:30:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-d49e36f6-752e-4f12-b6c2-f75262a9fd9f req-671426e6-ce42-4ecd-975a-3a323fd464d6 service nova] Lock "cf494c03-d188-49c7-879e-29d2fa555549-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:30:15 user nova-compute[71283]: DEBUG nova.compute.manager [req-d49e36f6-752e-4f12-b6c2-f75262a9fd9f req-671426e6-ce42-4ecd-975a-3a323fd464d6 service nova] [instance: cf494c03-d188-49c7-879e-29d2fa555549] No waiting events found dispatching network-vif-plugged-eafcafd8-d33e-48b1-9947-cbce2d458180 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:30:15 user nova-compute[71283]: WARNING nova.compute.manager [req-d49e36f6-752e-4f12-b6c2-f75262a9fd9f req-671426e6-ce42-4ecd-975a-3a323fd464d6 service nova] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Received unexpected event network-vif-plugged-eafcafd8-d33e-48b1-9947-cbce2d458180 for instance with vm_state deleted and task_state None. 
Apr 20 10:30:16 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-b2da604f-6cf6-4ab6-b5cf-9497d414fbb9 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:30:16 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-b2da604f-6cf6-4ab6-b5cf-9497d414fbb9 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:30:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-b2da604f-6cf6-4ab6-b5cf-9497d414fbb9 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.244s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:30:16 user nova-compute[71283]: INFO nova.scheduler.client.report [None req-b2da604f-6cf6-4ab6-b5cf-9497d414fbb9 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Deleted allocations for instance cf494c03-d188-49c7-879e-29d2fa555549 Apr 20 10:30:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None 
req-b2da604f-6cf6-4ab6-b5cf-9497d414fbb9 tempest-DeleteServersTestJSON-1091162656 tempest-DeleteServersTestJSON-1091162656-project-member] Lock "cf494c03-d188-49c7-879e-29d2fa555549" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.838s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:30:16 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:30:16 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:30:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "refresh_cache-f48e6aa1-dd33-42a4-89c9-20691b628c70" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:30:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquired lock "refresh_cache-f48e6aa1-dd33-42a4-89c9-20691b628c70" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:30:16 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Forcefully refreshing network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 10:30:17 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 
f48e6aa1-dd33-42a4-89c9-20691b628c70] Updating instance_info_cache with network_info: [{"id": "0954983e-5cb4-4486-9816-67f4f5f78b35", "address": "fa:16:3e:dc:a8:8c", "network": {"id": "06131689-128d-43da-b721-a9dda3f2e19f", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1783672421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2d1545b79af0497497e960cbd68aa8f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0954983e-5c", "ovs_interfaceid": "0954983e-5cb4-4486-9816-67f4f5f78b35", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:30:17 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Releasing lock "refresh_cache-f48e6aa1-dd33-42a4-89c9-20691b628c70" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:30:17 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Updated the network info_cache for instance {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 10:30:17 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) 
run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:30:17 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:30:17 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:30:17 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:30:17 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:30:18 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:30:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:19 user nova-compute[71283]: DEBUG nova.compute.manager [req-3fdd2724-f19e-4751-a876-33c6410490c4 req-5c4cb22a-ac3d-453b-9cad-2ef4127475b6 service nova] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Received event network-changed-0954983e-5cb4-4486-9816-67f4f5f78b35 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:30:19 user nova-compute[71283]: DEBUG nova.compute.manager [req-3fdd2724-f19e-4751-a876-33c6410490c4 req-5c4cb22a-ac3d-453b-9cad-2ef4127475b6 service nova] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Refreshing instance network info cache due to event network-changed-0954983e-5cb4-4486-9816-67f4f5f78b35. 
{{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:30:19 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-3fdd2724-f19e-4751-a876-33c6410490c4 req-5c4cb22a-ac3d-453b-9cad-2ef4127475b6 service nova] Acquiring lock "refresh_cache-f48e6aa1-dd33-42a4-89c9-20691b628c70" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:30:19 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-3fdd2724-f19e-4751-a876-33c6410490c4 req-5c4cb22a-ac3d-453b-9cad-2ef4127475b6 service nova] Acquired lock "refresh_cache-f48e6aa1-dd33-42a4-89c9-20691b628c70" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:30:19 user nova-compute[71283]: DEBUG nova.network.neutron [req-3fdd2724-f19e-4751-a876-33c6410490c4 req-5c4cb22a-ac3d-453b-9cad-2ef4127475b6 service nova] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Refreshing network info cache for port 0954983e-5cb4-4486-9816-67f4f5f78b35 {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:30:20 user nova-compute[71283]: DEBUG nova.network.neutron [req-3fdd2724-f19e-4751-a876-33c6410490c4 req-5c4cb22a-ac3d-453b-9cad-2ef4127475b6 service nova] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Updated VIF entry in instance network info cache for port 0954983e-5cb4-4486-9816-67f4f5f78b35. 
{{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:30:20 user nova-compute[71283]: DEBUG nova.network.neutron [req-3fdd2724-f19e-4751-a876-33c6410490c4 req-5c4cb22a-ac3d-453b-9cad-2ef4127475b6 service nova] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Updating instance_info_cache with network_info: [{"id": "0954983e-5cb4-4486-9816-67f4f5f78b35", "address": "fa:16:3e:dc:a8:8c", "network": {"id": "06131689-128d-43da-b721-a9dda3f2e19f", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1783672421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.122", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2d1545b79af0497497e960cbd68aa8f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0954983e-5c", "ovs_interfaceid": "0954983e-5cb4-4486-9816-67f4f5f78b35", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:30:20 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-3fdd2724-f19e-4751-a876-33c6410490c4 req-5c4cb22a-ac3d-453b-9cad-2ef4127475b6 service nova] Releasing lock "refresh_cache-f48e6aa1-dd33-42a4-89c9-20691b628c70" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:30:20 user nova-compute[71283]: DEBUG nova.compute.manager [None req-1af01c94-409a-4baf-a932-856bdef9d019 
tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:30:20 user nova-compute[71283]: INFO nova.compute.manager [None req-1af01c94-409a-4baf-a932-856bdef9d019 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] instance snapshotting Apr 20 10:30:20 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-1af01c94-409a-4baf-a932-856bdef9d019 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Beginning live snapshot process Apr 20 10:30:21 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1af01c94-409a-4baf-a932-856bdef9d019 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f48e6aa1-dd33-42a4-89c9-20691b628c70/disk --force-share --output=json -f qcow2 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:30:21 user nova-compute[71283]: DEBUG nova.compute.manager [req-1842dd69-dc45-467e-aa46-8a125931e9d4 req-c7fa387c-9b3f-4e4a-8b80-e4d9a91b2399 service nova] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Received event network-changed-9bd8e190-3b17-470b-9274-5060f7875bba {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:30:21 user nova-compute[71283]: DEBUG nova.compute.manager [req-1842dd69-dc45-467e-aa46-8a125931e9d4 req-c7fa387c-9b3f-4e4a-8b80-e4d9a91b2399 service nova] [instance: 
0f7a669c-28aa-42d2-8991-5852336c0f42] Refreshing instance network info cache due to event network-changed-9bd8e190-3b17-470b-9274-5060f7875bba. {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:30:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-1842dd69-dc45-467e-aa46-8a125931e9d4 req-c7fa387c-9b3f-4e4a-8b80-e4d9a91b2399 service nova] Acquiring lock "refresh_cache-0f7a669c-28aa-42d2-8991-5852336c0f42" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:30:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-1842dd69-dc45-467e-aa46-8a125931e9d4 req-c7fa387c-9b3f-4e4a-8b80-e4d9a91b2399 service nova] Acquired lock "refresh_cache-0f7a669c-28aa-42d2-8991-5852336c0f42" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:30:21 user nova-compute[71283]: DEBUG nova.network.neutron [req-1842dd69-dc45-467e-aa46-8a125931e9d4 req-c7fa387c-9b3f-4e4a-8b80-e4d9a91b2399 service nova] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Refreshing network info cache for port 9bd8e190-3b17-470b-9274-5060f7875bba {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:30:21 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1af01c94-409a-4baf-a932-856bdef9d019 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f48e6aa1-dd33-42a4-89c9-20691b628c70/disk --force-share --output=json -f qcow2" returned: 0 in 0.147s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:30:21 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1af01c94-409a-4baf-a932-856bdef9d019 
tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f48e6aa1-dd33-42a4-89c9-20691b628c70/disk --force-share --output=json -f qcow2 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:30:21 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1af01c94-409a-4baf-a932-856bdef9d019 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f48e6aa1-dd33-42a4-89c9-20691b628c70/disk --force-share --output=json -f qcow2" returned: 0 in 0.140s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:30:21 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1af01c94-409a-4baf-a932-856bdef9d019 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:30:21 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1af01c94-409a-4baf-a932-856bdef9d019 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.150s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:30:21 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1af01c94-409a-4baf-a932-856bdef9d019 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmptnzziea7/67a57123fa2c4567bb30762afb8c94d8.delta 1073741824 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:30:21 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1af01c94-409a-4baf-a932-856bdef9d019 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmptnzziea7/67a57123fa2c4567bb30762afb8c94d8.delta 1073741824" returned: 0 in 0.047s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:30:21 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-1af01c94-409a-4baf-a932-856bdef9d019 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Quiescing instance not available: QEMU guest agent is not enabled. 
Apr 20 10:30:22 user nova-compute[71283]: DEBUG nova.compute.manager [req-fa40d346-6960-49ac-9a5f-d58af3b3633a req-fd0a69fa-f90a-4fe1-9251-b737e211353c service nova] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Received event network-changed-851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:30:22 user nova-compute[71283]: DEBUG nova.compute.manager [req-fa40d346-6960-49ac-9a5f-d58af3b3633a req-fd0a69fa-f90a-4fe1-9251-b737e211353c service nova] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Refreshing instance network info cache due to event network-changed-851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1. {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:30:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-fa40d346-6960-49ac-9a5f-d58af3b3633a req-fd0a69fa-f90a-4fe1-9251-b737e211353c service nova] Acquiring lock "refresh_cache-64fb0d55-ef35-4386-86fe-00775b83a8d4" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:30:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-fa40d346-6960-49ac-9a5f-d58af3b3633a req-fd0a69fa-f90a-4fe1-9251-b737e211353c service nova] Acquired lock "refresh_cache-64fb0d55-ef35-4386-86fe-00775b83a8d4" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:30:22 user nova-compute[71283]: DEBUG nova.network.neutron [req-fa40d346-6960-49ac-9a5f-d58af3b3633a req-fd0a69fa-f90a-4fe1-9251-b737e211353c service nova] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Refreshing network info cache for port 851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1 {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:30:22 user nova-compute[71283]: DEBUG nova.network.neutron [req-1842dd69-dc45-467e-aa46-8a125931e9d4 req-c7fa387c-9b3f-4e4a-8b80-e4d9a91b2399 service nova] [instance: 
0f7a669c-28aa-42d2-8991-5852336c0f42] Updated VIF entry in instance network info cache for port 9bd8e190-3b17-470b-9274-5060f7875bba. {{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:30:22 user nova-compute[71283]: DEBUG nova.network.neutron [req-1842dd69-dc45-467e-aa46-8a125931e9d4 req-c7fa387c-9b3f-4e4a-8b80-e4d9a91b2399 service nova] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Updating instance_info_cache with network_info: [{"id": "9bd8e190-3b17-470b-9274-5060f7875bba", "address": "fa:16:3e:78:4f:ac", "network": {"id": "a64312c6-b1a8-450f-9e8d-0cf0282aef90", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-329219891-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1b4a2af680394ec889b4661753658b01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bd8e190-3b", "ovs_interfaceid": "9bd8e190-3b17-470b-9274-5060f7875bba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:30:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-1842dd69-dc45-467e-aa46-8a125931e9d4 req-c7fa387c-9b3f-4e4a-8b80-e4d9a91b2399 service nova] Releasing lock "refresh_cache-0f7a669c-28aa-42d2-8991-5852336c0f42" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:30:22 
user nova-compute[71283]: DEBUG nova.virt.libvirt.guest [None req-1af01c94-409a-4baf-a932-856bdef9d019 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] COPY block job progress, current cursor: 0 final cursor: 43778048 {{(pid=71283) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 20 10:30:22 user nova-compute[71283]: DEBUG nova.network.neutron [req-fa40d346-6960-49ac-9a5f-d58af3b3633a req-fd0a69fa-f90a-4fe1-9251-b737e211353c service nova] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Updated VIF entry in instance network info cache for port 851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1. {{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:30:22 user nova-compute[71283]: DEBUG nova.network.neutron [req-fa40d346-6960-49ac-9a5f-d58af3b3633a req-fd0a69fa-f90a-4fe1-9251-b737e211353c service nova] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Updating instance_info_cache with network_info: [{"id": "851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1", "address": "fa:16:3e:fe:b8:27", "network": {"id": "e68379ef-0781-4af6-be68-a311cdded61e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-509902278-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8d9c5dd66fd0496a8e92b41d98fe1727", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap851e6a9d-9d", "ovs_interfaceid": "851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": 
"normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:30:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-fa40d346-6960-49ac-9a5f-d58af3b3633a req-fd0a69fa-f90a-4fe1-9251-b737e211353c service nova] Releasing lock "refresh_cache-64fb0d55-ef35-4386-86fe-00775b83a8d4" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:30:22 user nova-compute[71283]: DEBUG nova.virt.libvirt.guest [None req-1af01c94-409a-4baf-a932-856bdef9d019 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] COPY block job progress, current cursor: 43778048 final cursor: 43778048 {{(pid=71283) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 20 10:30:22 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-1af01c94-409a-4baf-a932-856bdef9d019 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Skipping quiescing instance: QEMU guest agent is not enabled. 
Apr 20 10:30:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-21eb42ab-ecdf-4d6a-af9e-59fd7de86f65 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Acquiring lock "0f7a669c-28aa-42d2-8991-5852336c0f42" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:30:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-21eb42ab-ecdf-4d6a-af9e-59fd7de86f65 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "0f7a669c-28aa-42d2-8991-5852336c0f42" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:30:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-21eb42ab-ecdf-4d6a-af9e-59fd7de86f65 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Acquiring lock "0f7a669c-28aa-42d2-8991-5852336c0f42-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:30:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-21eb42ab-ecdf-4d6a-af9e-59fd7de86f65 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "0f7a669c-28aa-42d2-8991-5852336c0f42-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:30:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-21eb42ab-ecdf-4d6a-af9e-59fd7de86f65
tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "0f7a669c-28aa-42d2-8991-5852336c0f42-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:30:22 user nova-compute[71283]: INFO nova.compute.manager [None req-21eb42ab-ecdf-4d6a-af9e-59fd7de86f65 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Terminating instance Apr 20 10:30:22 user nova-compute[71283]: DEBUG nova.compute.manager [None req-21eb42ab-ecdf-4d6a-af9e-59fd7de86f65 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Start destroying the instance on the hypervisor. {{(pid=71283) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 10:30:22 user nova-compute[71283]: DEBUG nova.privsep.utils [None req-1af01c94-409a-4baf-a932-856bdef9d019 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71283) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 20 10:30:22 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1af01c94-409a-4baf-a932-856bdef9d019 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmptnzziea7/67a57123fa2c4567bb30762afb8c94d8.delta /opt/stack/data/nova/instances/snapshots/tmptnzziea7/67a57123fa2c4567bb30762afb8c94d8 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:30:22 user
nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:23 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:23 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:23 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1af01c94-409a-4baf-a932-856bdef9d019 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmptnzziea7/67a57123fa2c4567bb30762afb8c94d8.delta /opt/stack/data/nova/instances/snapshots/tmptnzziea7/67a57123fa2c4567bb30762afb8c94d8" returned: 0 in 0.221s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:30:23 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-1af01c94-409a-4baf-a932-856bdef9d019 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Snapshot extracted, beginning image upload Apr 20 10:30:23 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Instance destroyed successfully. 
Apr 20 10:30:23 user nova-compute[71283]: DEBUG nova.objects.instance [None req-21eb42ab-ecdf-4d6a-af9e-59fd7de86f65 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lazy-loading 'resources' on Instance uuid 0f7a669c-28aa-42d2-8991-5852336c0f42 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:30:23 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-21eb42ab-ecdf-4d6a-af9e-59fd7de86f65 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:28:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-268849146',display_name='tempest-AttachVolumeTestJSON-server-268849146',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-268849146',id=3,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGAh9xOp+18xwLGeRalB9BZJ/w8VTQdLWJWLt/GaIgucfc2qbhwsBZgwV0eAZLo1CJ17o1m08+BmrJtQvN99qzggaLadMbVfpQ2nwEGkvh5hKNizDvlBcY5GFmW0Xmz1BQ==',key_name='tempest-keypair-2096466478',keypairs=<?>,launch_index=0,launched_at=2023-04-20T10:28:40Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1b4a2af680394ec889b4661753658b01',ramdisk_id='',reservation_id='r-du438i0g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeTestJSON-1715839687',owner_user_name='tempest-AttachVolumeTestJSON-1715839687-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2023-04-20T10:28:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='67955d2e81c04b8d8dbcbe577303e025',uuid=0f7a669c-28aa-42d2-8991-5852336c0f42,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9bd8e190-3b17-470b-9274-5060f7875bba", "address": "fa:16:3e:78:4f:ac", "network": {"id": "a64312c6-b1a8-450f-9e8d-0cf0282aef90", "bridge": "br-int", "label":
"tempest-AttachVolumeTestJSON-329219891-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1b4a2af680394ec889b4661753658b01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bd8e190-3b", "ovs_interfaceid": "9bd8e190-3b17-470b-9274-5060f7875bba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 10:30:23 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-21eb42ab-ecdf-4d6a-af9e-59fd7de86f65 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Converting VIF {"id": "9bd8e190-3b17-470b-9274-5060f7875bba", "address": "fa:16:3e:78:4f:ac", "network": {"id": "a64312c6-b1a8-450f-9e8d-0cf0282aef90", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-329219891-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1b4a2af680394ec889b4661753658b01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", 
"bound_drivers": {"0": "ovn"}}, "devname": "tap9bd8e190-3b", "ovs_interfaceid": "9bd8e190-3b17-470b-9274-5060f7875bba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:30:23 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-21eb42ab-ecdf-4d6a-af9e-59fd7de86f65 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:78:4f:ac,bridge_name='br-int',has_traffic_filtering=True,id=9bd8e190-3b17-470b-9274-5060f7875bba,network=Network(a64312c6-b1a8-450f-9e8d-0cf0282aef90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bd8e190-3b') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:30:23 user nova-compute[71283]: DEBUG os_vif [None req-21eb42ab-ecdf-4d6a-af9e-59fd7de86f65 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:78:4f:ac,bridge_name='br-int',has_traffic_filtering=True,id=9bd8e190-3b17-470b-9274-5060f7875bba,network=Network(a64312c6-b1a8-450f-9e8d-0cf0282aef90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bd8e190-3b') {{(pid=71283) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 10:30:23 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:23 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9bd8e190-3b, bridge=br-int, if_exists=True) {{(pid=71283) do_commit 
/usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:30:23 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:23 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:30:23 user nova-compute[71283]: INFO os_vif [None req-21eb42ab-ecdf-4d6a-af9e-59fd7de86f65 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:78:4f:ac,bridge_name='br-int',has_traffic_filtering=True,id=9bd8e190-3b17-470b-9274-5060f7875bba,network=Network(a64312c6-b1a8-450f-9e8d-0cf0282aef90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bd8e190-3b') Apr 20 10:30:23 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-21eb42ab-ecdf-4d6a-af9e-59fd7de86f65 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Deleting instance files /opt/stack/data/nova/instances/0f7a669c-28aa-42d2-8991-5852336c0f42_del Apr 20 10:30:23 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-21eb42ab-ecdf-4d6a-af9e-59fd7de86f65 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Deletion of /opt/stack/data/nova/instances/0f7a669c-28aa-42d2-8991-5852336c0f42_del complete Apr 20 10:30:23 user nova-compute[71283]: INFO nova.compute.manager [None req-21eb42ab-ecdf-4d6a-af9e-59fd7de86f65 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Took 0.83 seconds to destroy the instance on 
the hypervisor. Apr 20 10:30:23 user nova-compute[71283]: DEBUG oslo.service.loopingcall [None req-21eb42ab-ecdf-4d6a-af9e-59fd7de86f65 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=71283) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 10:30:23 user nova-compute[71283]: DEBUG nova.compute.manager [-] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Deallocating network for instance {{(pid=71283) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 10:30:23 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] deallocate_for_instance() {{(pid=71283) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 10:30:24 user nova-compute[71283]: DEBUG nova.compute.manager [req-b9e3515e-1946-4b8b-acdf-d89dcbe373c2 req-37937407-b43c-453b-98c6-f2af984ce6f1 service nova] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Received event network-vif-unplugged-9bd8e190-3b17-470b-9274-5060f7875bba {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:30:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-b9e3515e-1946-4b8b-acdf-d89dcbe373c2 req-37937407-b43c-453b-98c6-f2af984ce6f1 service nova] Acquiring lock "0f7a669c-28aa-42d2-8991-5852336c0f42-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:30:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-b9e3515e-1946-4b8b-acdf-d89dcbe373c2 req-37937407-b43c-453b-98c6-f2af984ce6f1 service nova] Lock "0f7a669c-28aa-42d2-8991-5852336c0f42-events" acquired by
"nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:30:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-b9e3515e-1946-4b8b-acdf-d89dcbe373c2 req-37937407-b43c-453b-98c6-f2af984ce6f1 service nova] Lock "0f7a669c-28aa-42d2-8991-5852336c0f42-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:30:24 user nova-compute[71283]: DEBUG nova.compute.manager [req-b9e3515e-1946-4b8b-acdf-d89dcbe373c2 req-37937407-b43c-453b-98c6-f2af984ce6f1 service nova] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] No waiting events found dispatching network-vif-unplugged-9bd8e190-3b17-470b-9274-5060f7875bba {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:30:24 user nova-compute[71283]: DEBUG nova.compute.manager [req-b9e3515e-1946-4b8b-acdf-d89dcbe373c2 req-37937407-b43c-453b-98c6-f2af984ce6f1 service nova] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Received event network-vif-unplugged-9bd8e190-3b17-470b-9274-5060f7875bba for instance with task_state deleting.
{{(pid=71283) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 10:30:24 user nova-compute[71283]: DEBUG nova.compute.manager [req-b9e3515e-1946-4b8b-acdf-d89dcbe373c2 req-37937407-b43c-453b-98c6-f2af984ce6f1 service nova] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Received event network-vif-plugged-9bd8e190-3b17-470b-9274-5060f7875bba {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:30:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-b9e3515e-1946-4b8b-acdf-d89dcbe373c2 req-37937407-b43c-453b-98c6-f2af984ce6f1 service nova] Acquiring lock "0f7a669c-28aa-42d2-8991-5852336c0f42-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:30:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-b9e3515e-1946-4b8b-acdf-d89dcbe373c2 req-37937407-b43c-453b-98c6-f2af984ce6f1 service nova] Lock "0f7a669c-28aa-42d2-8991-5852336c0f42-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:30:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-b9e3515e-1946-4b8b-acdf-d89dcbe373c2 req-37937407-b43c-453b-98c6-f2af984ce6f1 service nova] Lock "0f7a669c-28aa-42d2-8991-5852336c0f42-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:30:24 user nova-compute[71283]: DEBUG nova.compute.manager [req-b9e3515e-1946-4b8b-acdf-d89dcbe373c2 req-37937407-b43c-453b-98c6-f2af984ce6f1 service nova] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] No waiting events found dispatching
network-vif-plugged-9bd8e190-3b17-470b-9274-5060f7875bba {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:30:24 user nova-compute[71283]: WARNING nova.compute.manager [req-b9e3515e-1946-4b8b-acdf-d89dcbe373c2 req-37937407-b43c-453b-98c6-f2af984ce6f1 service nova] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Received unexpected event network-vif-plugged-9bd8e190-3b17-470b-9274-5060f7875bba for instance with vm_state active and task_state deleting. Apr 20 10:30:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:24 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:30:24 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Took 1.15 seconds to deallocate network for instance. 
Apr 20 10:30:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-21eb42ab-ecdf-4d6a-af9e-59fd7de86f65 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:30:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-21eb42ab-ecdf-4d6a-af9e-59fd7de86f65 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:30:25 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-21eb42ab-ecdf-4d6a-af9e-59fd7de86f65 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:30:25 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-21eb42ab-ecdf-4d6a-af9e-59fd7de86f65 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} 
Apr 20 10:30:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-21eb42ab-ecdf-4d6a-af9e-59fd7de86f65 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.247s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:30:25 user nova-compute[71283]: INFO nova.scheduler.client.report [None req-21eb42ab-ecdf-4d6a-af9e-59fd7de86f65 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Deleted allocations for instance 0f7a669c-28aa-42d2-8991-5852336c0f42 Apr 20 10:30:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-21eb42ab-ecdf-4d6a-af9e-59fd7de86f65 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "0f7a669c-28aa-42d2-8991-5852336c0f42" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.426s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:30:25 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-1af01c94-409a-4baf-a932-856bdef9d019 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Snapshot image upload complete Apr 20 10:30:25 user nova-compute[71283]: INFO nova.compute.manager [None req-1af01c94-409a-4baf-a932-856bdef9d019 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Took 4.77 seconds to snapshot the instance on the hypervisor. 
Apr 20 10:30:25 user nova-compute[71283]: DEBUG nova.compute.manager [req-ca39cf93-ce8c-40f0-8ca9-aa14657d3dfa req-55f93bba-c9ec-4b85-b63e-d48c0f101029 service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Received event network-changed-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:30:25 user nova-compute[71283]: DEBUG nova.compute.manager [req-ca39cf93-ce8c-40f0-8ca9-aa14657d3dfa req-55f93bba-c9ec-4b85-b63e-d48c0f101029 service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Refreshing instance network info cache due to event network-changed-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd. {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:30:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-ca39cf93-ce8c-40f0-8ca9-aa14657d3dfa req-55f93bba-c9ec-4b85-b63e-d48c0f101029 service nova] Acquiring lock "refresh_cache-9189f862-2e91-4420-ab64-54375c4f9466" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:30:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-ca39cf93-ce8c-40f0-8ca9-aa14657d3dfa req-55f93bba-c9ec-4b85-b63e-d48c0f101029 service nova] Acquired lock "refresh_cache-9189f862-2e91-4420-ab64-54375c4f9466" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:30:25 user nova-compute[71283]: DEBUG nova.network.neutron [req-ca39cf93-ce8c-40f0-8ca9-aa14657d3dfa req-55f93bba-c9ec-4b85-b63e-d48c0f101029 service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Refreshing network info cache for port ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:30:26 user nova-compute[71283]: DEBUG nova.compute.manager [req-246832cd-0f25-409e-83a0-7e2457ea435c req-57a5f7d9-7591-40b9-93cc-f9c1e9e645d5 service nova] [instance: 
0f7a669c-28aa-42d2-8991-5852336c0f42] Received event network-vif-deleted-9bd8e190-3b17-470b-9274-5060f7875bba {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:30:26 user nova-compute[71283]: DEBUG nova.network.neutron [req-ca39cf93-ce8c-40f0-8ca9-aa14657d3dfa req-55f93bba-c9ec-4b85-b63e-d48c0f101029 service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Updated VIF entry in instance network info cache for port ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd. {{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:30:26 user nova-compute[71283]: DEBUG nova.network.neutron [req-ca39cf93-ce8c-40f0-8ca9-aa14657d3dfa req-55f93bba-c9ec-4b85-b63e-d48c0f101029 service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Updating instance_info_cache with network_info: [{"id": "ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd", "address": "fa:16:3e:39:2f:ad", "network": {"id": "55052d2c-5064-4bd3-9b46-6e90a5ee3def", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1948401123-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.2", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7c3ecc1463ec42eea56f2890b032ef7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapab1a6c29-6d", "ovs_interfaceid": "ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info 
/opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:30:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-ca39cf93-ce8c-40f0-8ca9-aa14657d3dfa req-55f93bba-c9ec-4b85-b63e-d48c0f101029 service nova] Releasing lock "refresh_cache-9189f862-2e91-4420-ab64-54375c4f9466" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:30:28 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Acquiring lock "db38a0ab-0165-480d-bddb-cab50bcd22e4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:30:28 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "db38a0ab-0165-480d-bddb-cab50bcd22e4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:30:28 user nova-compute[71283]: DEBUG nova.compute.manager [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Starting instance... 
{{(pid=71283) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 10:30:28 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:30:28 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:30:28 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71283) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 10:30:28 user nova-compute[71283]: INFO nova.compute.claims [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Claim successful on node user Apr 20 10:30:28 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:28 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:30:28 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:30:28 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "compute_resources" 
"released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.310s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:30:28 user nova-compute[71283]: DEBUG nova.compute.manager [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Start building networks asynchronously for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 10:30:28 user nova-compute[71283]: DEBUG nova.compute.manager [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Allocating IP information in the background. {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 10:30:28 user nova-compute[71283]: DEBUG nova.network.neutron [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] allocate_for_instance() {{(pid=71283) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 10:30:28 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 20 10:30:28 user nova-compute[71283]: DEBUG nova.compute.manager [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Start building block device mappings for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 10:30:28 user nova-compute[71283]: DEBUG nova.policy [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '712be6d6876f4b2c9d796e406a43f8bf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7c3ecc1463ec42eea56f2890b032ef7a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71283) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 10:30:29 user nova-compute[71283]: DEBUG nova.compute.manager [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Start spawning the instance on the hypervisor. 
{{(pid=71283) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 10:30:29 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Creating instance directory {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 10:30:29 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Creating image(s) Apr 20 10:30:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Acquiring lock "/opt/stack/data/nova/instances/db38a0ab-0165-480d-bddb-cab50bcd22e4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:30:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "/opt/stack/data/nova/instances/db38a0ab-0165-480d-bddb-cab50bcd22e4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:30:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 
tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "/opt/stack/data/nova/instances/db38a0ab-0165-480d-bddb-cab50bcd22e4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:30:29 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:30:29 user nova-compute[71283]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:30:29 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: cf494c03-d188-49c7-879e-29d2fa555549] VM Stopped (Lifecycle Event) Apr 20 10:30:29 user nova-compute[71283]: DEBUG nova.compute.manager [None req-bc244303-49f3-4ebf-a6e0-cc75cbfe4d79 None None] [instance: cf494c03-d188-49c7-879e-29d2fa555549] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:30:29 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.132s {{(pid=71283) 
execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:30:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Acquiring lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:30:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:30:29 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:30:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:29 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 
tempest-VolumesAdminNegativeTest-2134667240-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.158s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:30:29 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/db38a0ab-0165-480d-bddb-cab50bcd22e4/disk 1073741824 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:30:29 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/db38a0ab-0165-480d-bddb-cab50bcd22e4/disk 1073741824" returned: 0 in 0.046s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:30:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.209s {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:30:29 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:30:29 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.137s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:30:29 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Checking if we can resize image /opt/stack/data/nova/instances/db38a0ab-0165-480d-bddb-cab50bcd22e4/disk. 
size=1073741824 {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 10:30:29 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/db38a0ab-0165-480d-bddb-cab50bcd22e4/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:30:29 user nova-compute[71283]: DEBUG nova.network.neutron [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Successfully created port: 7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee {{(pid=71283) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 10:30:29 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/db38a0ab-0165-480d-bddb-cab50bcd22e4/disk --force-share --output=json" returned: 0 in 0.126s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:30:29 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Cannot resize image /opt/stack/data/nova/instances/db38a0ab-0165-480d-bddb-cab50bcd22e4/disk to a smaller size. 
{{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 10:30:29 user nova-compute[71283]: DEBUG nova.objects.instance [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lazy-loading 'migration_context' on Instance uuid db38a0ab-0165-480d-bddb-cab50bcd22e4 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:30:29 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Created local disks {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 10:30:29 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Ensure instance console log exists: /opt/stack/data/nova/instances/db38a0ab-0165-480d-bddb-cab50bcd22e4/console.log {{(pid=71283) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 10:30:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:30:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:30:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.network.neutron [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Successfully updated port: 7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee {{(pid=71283) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Acquiring lock "refresh_cache-db38a0ab-0165-480d-bddb-cab50bcd22e4" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Acquired lock "refresh_cache-db38a0ab-0165-480d-bddb-cab50bcd22e4" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.network.neutron [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 
tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Building network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.compute.manager [req-61a04983-f5d9-42ef-978a-7e57fef8ebf0 req-a93edca3-b903-4f76-898d-829c2f778cda service nova] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Received event network-changed-7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.compute.manager [req-61a04983-f5d9-42ef-978a-7e57fef8ebf0 req-a93edca3-b903-4f76-898d-829c2f778cda service nova] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Refreshing instance network info cache due to event network-changed-7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee. {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-61a04983-f5d9-42ef-978a-7e57fef8ebf0 req-a93edca3-b903-4f76-898d-829c2f778cda service nova] Acquiring lock "refresh_cache-db38a0ab-0165-480d-bddb-cab50bcd22e4" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.network.neutron [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Instance cache missing network info. 
{{(pid=71283) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.network.neutron [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Updating instance_info_cache with network_info: [{"id": "7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee", "address": "fa:16:3e:21:d2:da", "network": {"id": "55052d2c-5064-4bd3-9b46-6e90a5ee3def", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1948401123-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7c3ecc1463ec42eea56f2890b032ef7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d16f167-ea", "ovs_interfaceid": "7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Releasing lock "refresh_cache-db38a0ab-0165-480d-bddb-cab50bcd22e4" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.compute.manager [None 
req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Instance network_info: |[{"id": "7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee", "address": "fa:16:3e:21:d2:da", "network": {"id": "55052d2c-5064-4bd3-9b46-6e90a5ee3def", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1948401123-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7c3ecc1463ec42eea56f2890b032ef7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d16f167-ea", "ovs_interfaceid": "7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-61a04983-f5d9-42ef-978a-7e57fef8ebf0 req-a93edca3-b903-4f76-898d-829c2f778cda service nova] Acquired lock "refresh_cache-db38a0ab-0165-480d-bddb-cab50bcd22e4" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.network.neutron [req-61a04983-f5d9-42ef-978a-7e57fef8ebf0 req-a93edca3-b903-4f76-898d-829c2f778cda service nova] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Refreshing network info cache for port 7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee {{(pid=71283) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Start _get_guest_xml network_info=[{"id": "7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee", "address": "fa:16:3e:21:d2:da", "network": {"id": "55052d2c-5064-4bd3-9b46-6e90a5ee3def", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1948401123-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7c3ecc1463ec42eea56f2890b032ef7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d16f167-ea", "ovs_interfaceid": "7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=) 
rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '3687b278-c2bc-46f2-9b7e-579c8d06fe41'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 10:30:30 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:30:30 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71283) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T10:25:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=), allow threads: True {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Flavor limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 
tempest-VolumesAdminNegativeTest-2134667240-project-member] Image limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Flavor pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Image pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 
10:30:30 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Got 1 possible topologies {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:30:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1784158863',display_name='tempest-VolumesAdminNegativeTest-server-1784158863',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1784158863',id=8,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7c3ecc1463ec42eea56f2890b032ef7a',ramdisk_id='',reservation_id='r-9l2ew4yx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-2134667240',owner_user_name='tempest-VolumesAdminNegativeTest-2134667240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:30:29Z,user_data=None,user_id='712be6d6876f4b2
c9d796e406a43f8bf',uuid=db38a0ab-0165-480d-bddb-cab50bcd22e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee", "address": "fa:16:3e:21:d2:da", "network": {"id": "55052d2c-5064-4bd3-9b46-6e90a5ee3def", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1948401123-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7c3ecc1463ec42eea56f2890b032ef7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d16f167-ea", "ovs_interfaceid": "7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71283) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Converting VIF {"id": "7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee", "address": "fa:16:3e:21:d2:da", "network": {"id": "55052d2c-5064-4bd3-9b46-6e90a5ee3def", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1948401123-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "7c3ecc1463ec42eea56f2890b032ef7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d16f167-ea", "ovs_interfaceid": "7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:21:d2:da,bridge_name='br-int',has_traffic_filtering=True,id=7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee,network=Network(55052d2c-5064-4bd3-9b46-6e90a5ee3def),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d16f167-ea') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.objects.instance [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lazy-loading 'pci_devices' on Instance uuid db38a0ab-0165-480d-bddb-cab50bcd22e4 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] End _get_guest_xml xml= Apr 20 10:30:30 user nova-compute[71283]: db38a0ab-0165-480d-bddb-cab50bcd22e4 Apr 20 10:30:30 user nova-compute[71283]: instance-00000008 Apr 20 10:30:30 user 
nova-compute[71283]: 131072 Apr 20 10:30:30 user nova-compute[71283]: 1 Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: tempest-VolumesAdminNegativeTest-server-1784158863 Apr 20 10:30:30 user nova-compute[71283]: 2023-04-20 10:30:30 Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: 128 Apr 20 10:30:30 user nova-compute[71283]: 1 Apr 20 10:30:30 user nova-compute[71283]: 0 Apr 20 10:30:30 user nova-compute[71283]: 0 Apr 20 10:30:30 user nova-compute[71283]: 1 Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: tempest-VolumesAdminNegativeTest-2134667240-project-member Apr 20 10:30:30 user nova-compute[71283]: tempest-VolumesAdminNegativeTest-2134667240 Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: OpenStack Foundation Apr 20 10:30:30 user nova-compute[71283]: OpenStack Nova Apr 20 10:30:30 user nova-compute[71283]: 0.0.0 Apr 20 10:30:30 user nova-compute[71283]: db38a0ab-0165-480d-bddb-cab50bcd22e4 Apr 20 10:30:30 user nova-compute[71283]: db38a0ab-0165-480d-bddb-cab50bcd22e4 Apr 20 10:30:30 user nova-compute[71283]: Virtual Machine Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: hvm Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 
10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Nehalem Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: /dev/urandom Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: Apr 20 10:30:30 user nova-compute[71283]: {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 
tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:30:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1784158863',display_name='tempest-VolumesAdminNegativeTest-server-1784158863',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1784158863',id=8,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7c3ecc1463ec42eea56f2890b032ef7a',ramdisk_id='',reservation_id='r-9l2ew4yx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-2134667240',owner_user_name='tempest-VolumesAdminNegativeTest-2134667240-project-member'},tags=TagList,t
ask_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:30:29Z,user_data=None,user_id='712be6d6876f4b2c9d796e406a43f8bf',uuid=db38a0ab-0165-480d-bddb-cab50bcd22e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee", "address": "fa:16:3e:21:d2:da", "network": {"id": "55052d2c-5064-4bd3-9b46-6e90a5ee3def", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1948401123-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7c3ecc1463ec42eea56f2890b032ef7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d16f167-ea", "ovs_interfaceid": "7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Converting VIF {"id": "7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee", "address": "fa:16:3e:21:d2:da", "network": {"id": "55052d2c-5064-4bd3-9b46-6e90a5ee3def", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1948401123-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7c3ecc1463ec42eea56f2890b032ef7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d16f167-ea", "ovs_interfaceid": "7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:21:d2:da,bridge_name='br-int',has_traffic_filtering=True,id=7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee,network=Network(55052d2c-5064-4bd3-9b46-6e90a5ee3def),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d16f167-ea') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG os_vif [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:21:d2:da,bridge_name='br-int',has_traffic_filtering=True,id=7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee,network=Network(55052d2c-5064-4bd3-9b46-6e90a5ee3def),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d16f167-ea') {{(pid=71283) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7d16f167-ea, may_exist=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7d16f167-ea, col_values=(('external_ids', {'iface-id': '7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:21:d2:da', 'vm-uuid': 'db38a0ab-0165-480d-bddb-cab50bcd22e4'}),)) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:30:30 user 
nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:30 user nova-compute[71283]: INFO os_vif [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:21:d2:da,bridge_name='br-int',has_traffic_filtering=True,id=7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee,network=Network(55052d2c-5064-4bd3-9b46-6e90a5ee3def),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d16f167-ea') Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] No BDM found with device name vda, not building metadata. {{(pid=71283) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 10:30:30 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] No VIF found with MAC fa:16:3e:21:d2:da, not building metadata {{(pid=71283) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 10:30:31 user nova-compute[71283]: DEBUG nova.network.neutron [req-61a04983-f5d9-42ef-978a-7e57fef8ebf0 req-a93edca3-b903-4f76-898d-829c2f778cda service nova] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Updated VIF entry in instance network info cache for port 7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee. 
{{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:30:31 user nova-compute[71283]: DEBUG nova.network.neutron [req-61a04983-f5d9-42ef-978a-7e57fef8ebf0 req-a93edca3-b903-4f76-898d-829c2f778cda service nova] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Updating instance_info_cache with network_info: [{"id": "7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee", "address": "fa:16:3e:21:d2:da", "network": {"id": "55052d2c-5064-4bd3-9b46-6e90a5ee3def", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1948401123-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7c3ecc1463ec42eea56f2890b032ef7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d16f167-ea", "ovs_interfaceid": "7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:30:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-61a04983-f5d9-42ef-978a-7e57fef8ebf0 req-a93edca3-b903-4f76-898d-829c2f778cda service nova] Releasing lock "refresh_cache-db38a0ab-0165-480d-bddb-cab50bcd22e4" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:30:32 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:32 user 
nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:32 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:32 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:32 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:32 user nova-compute[71283]: DEBUG nova.compute.manager [req-8f0e9a2e-4b86-453b-bd1e-a4c8194e6537 req-fcff3081-8ad5-48af-9565-2a50884ef454 service nova] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Received event network-vif-plugged-7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:30:32 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-8f0e9a2e-4b86-453b-bd1e-a4c8194e6537 req-fcff3081-8ad5-48af-9565-2a50884ef454 service nova] Acquiring lock "db38a0ab-0165-480d-bddb-cab50bcd22e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:30:32 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-8f0e9a2e-4b86-453b-bd1e-a4c8194e6537 req-fcff3081-8ad5-48af-9565-2a50884ef454 service nova] Lock "db38a0ab-0165-480d-bddb-cab50bcd22e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:30:32 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils 
[req-8f0e9a2e-4b86-453b-bd1e-a4c8194e6537 req-fcff3081-8ad5-48af-9565-2a50884ef454 service nova] Lock "db38a0ab-0165-480d-bddb-cab50bcd22e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:30:32 user nova-compute[71283]: DEBUG nova.compute.manager [req-8f0e9a2e-4b86-453b-bd1e-a4c8194e6537 req-fcff3081-8ad5-48af-9565-2a50884ef454 service nova] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] No waiting events found dispatching network-vif-plugged-7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:30:32 user nova-compute[71283]: WARNING nova.compute.manager [req-8f0e9a2e-4b86-453b-bd1e-a4c8194e6537 req-fcff3081-8ad5-48af-9565-2a50884ef454 service nova] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Received unexpected event network-vif-plugged-7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee for instance with vm_state building and task_state spawning. 
Apr 20 10:30:34 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Resumed> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:30:34 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] VM Resumed (Lifecycle Event) Apr 20 10:30:34 user nova-compute[71283]: DEBUG nova.compute.manager [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Instance event wait completed in 0 seconds for {{(pid=71283) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 10:30:34 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Guest created on hypervisor {{(pid=71283) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 10:30:34 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Instance spawned successfully. 
Apr 20 10:30:34 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 10:30:34 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:30:34 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:30:34 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Found default for hw_cdrom_bus of ide {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:30:34 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Found default for hw_disk_bus of virtio {{(pid=71283) 
_register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:30:34 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Found default for hw_input_bus of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:30:34 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Found default for hw_pointer_model of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:30:34 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Found default for hw_video_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:30:34 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Found default for hw_vif_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:30:34 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 20 10:30:34 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Started> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:30:34 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] VM Started (Lifecycle Event) Apr 20 10:30:34 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:30:34 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:30:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:34 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 10:30:34 user nova-compute[71283]: INFO nova.compute.manager [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Took 5.31 seconds to spawn the instance on the hypervisor. 
Apr 20 10:30:34 user nova-compute[71283]: DEBUG nova.compute.manager [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:30:34 user nova-compute[71283]: INFO nova.compute.manager [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Took 5.90 seconds to build instance. Apr 20 10:30:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-c0c6203c-6eda-4bf3-862e-277dfbb8f9c7 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "db38a0ab-0165-480d-bddb-cab50bcd22e4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 5.987s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:30:34 user nova-compute[71283]: DEBUG nova.compute.manager [req-ffaffcc9-6d43-4d2a-9a82-13833567bce4 req-1207a24b-5570-4329-91c8-dab1e60c7ef3 service nova] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Received event network-vif-plugged-7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:30:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-ffaffcc9-6d43-4d2a-9a82-13833567bce4 req-1207a24b-5570-4329-91c8-dab1e60c7ef3 service nova] Acquiring lock "db38a0ab-0165-480d-bddb-cab50bcd22e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:30:34 user nova-compute[71283]: DEBUG 
oslo_concurrency.lockutils [req-ffaffcc9-6d43-4d2a-9a82-13833567bce4 req-1207a24b-5570-4329-91c8-dab1e60c7ef3 service nova] Lock "db38a0ab-0165-480d-bddb-cab50bcd22e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:30:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-ffaffcc9-6d43-4d2a-9a82-13833567bce4 req-1207a24b-5570-4329-91c8-dab1e60c7ef3 service nova] Lock "db38a0ab-0165-480d-bddb-cab50bcd22e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:30:34 user nova-compute[71283]: DEBUG nova.compute.manager [req-ffaffcc9-6d43-4d2a-9a82-13833567bce4 req-1207a24b-5570-4329-91c8-dab1e60c7ef3 service nova] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] No waiting events found dispatching network-vif-plugged-7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:30:34 user nova-compute[71283]: WARNING nova.compute.manager [req-ffaffcc9-6d43-4d2a-9a82-13833567bce4 req-1207a24b-5570-4329-91c8-dab1e60c7ef3 service nova] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Received unexpected event network-vif-plugged-7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee for instance with vm_state active and task_state None. 
Apr 20 10:30:35 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:38 user nova-compute[71283]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:30:38 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] VM Stopped (Lifecycle Event) Apr 20 10:30:38 user nova-compute[71283]: DEBUG nova.compute.manager [None req-cbbe2e0e-821e-4953-aeb6-74a969732792 None None] [instance: 0f7a669c-28aa-42d2-8991-5852336c0f42] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:30:39 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:40 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:41 user nova-compute[71283]: DEBUG nova.compute.manager [req-34361cbe-1c86-457c-bce5-e531156732e4 req-7311309a-ccae-42df-acb6-cbb3bb55575e service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Received event network-changed-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:30:41 user nova-compute[71283]: DEBUG nova.compute.manager [req-34361cbe-1c86-457c-bce5-e531156732e4 req-7311309a-ccae-42df-acb6-cbb3bb55575e service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Refreshing instance network info cache due to event network-changed-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d. 
{{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:30:41 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-34361cbe-1c86-457c-bce5-e531156732e4 req-7311309a-ccae-42df-acb6-cbb3bb55575e service nova] Acquiring lock "refresh_cache-bf7d300f-b748-446b-97ce-6cae12609e7a" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:30:41 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-34361cbe-1c86-457c-bce5-e531156732e4 req-7311309a-ccae-42df-acb6-cbb3bb55575e service nova] Acquired lock "refresh_cache-bf7d300f-b748-446b-97ce-6cae12609e7a" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:30:41 user nova-compute[71283]: DEBUG nova.network.neutron [req-34361cbe-1c86-457c-bce5-e531156732e4 req-7311309a-ccae-42df-acb6-cbb3bb55575e service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Refreshing network info cache for port ffe07547-7e1b-4f9e-afa9-31ccb5dca61d {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:30:41 user nova-compute[71283]: DEBUG nova.network.neutron [req-34361cbe-1c86-457c-bce5-e531156732e4 req-7311309a-ccae-42df-acb6-cbb3bb55575e service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Updated VIF entry in instance network info cache for port ffe07547-7e1b-4f9e-afa9-31ccb5dca61d. 
{{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:30:41 user nova-compute[71283]: DEBUG nova.network.neutron [req-34361cbe-1c86-457c-bce5-e531156732e4 req-7311309a-ccae-42df-acb6-cbb3bb55575e service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Updating instance_info_cache with network_info: [{"id": "ffe07547-7e1b-4f9e-afa9-31ccb5dca61d", "address": "fa:16:3e:5d:9f:5d", "network": {"id": "5a79f1a7-4d58-42fc-9c84-6f9951485092", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1428889344-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.26", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0f8979fdc4456db83cc88103856510", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapffe07547-7e", "ovs_interfaceid": "ffe07547-7e1b-4f9e-afa9-31ccb5dca61d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:30:42 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-34361cbe-1c86-457c-bce5-e531156732e4 req-7311309a-ccae-42df-acb6-cbb3bb55575e service nova] Releasing lock "refresh_cache-bf7d300f-b748-446b-97ce-6cae12609e7a" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-bc830912-9819-4265-a5f0-79890d07adea 
tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Acquiring lock "bf7d300f-b748-446b-97ce-6cae12609e7a" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-bc830912-9819-4265-a5f0-79890d07adea tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Lock "bf7d300f-b748-446b-97ce-6cae12609e7a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-bc830912-9819-4265-a5f0-79890d07adea tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Acquiring lock "bf7d300f-b748-446b-97ce-6cae12609e7a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-bc830912-9819-4265-a5f0-79890d07adea tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Lock "bf7d300f-b748-446b-97ce-6cae12609e7a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-bc830912-9819-4265-a5f0-79890d07adea tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Lock 
"bf7d300f-b748-446b-97ce-6cae12609e7a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:30:43 user nova-compute[71283]: INFO nova.compute.manager [None req-bc830912-9819-4265-a5f0-79890d07adea tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Terminating instance Apr 20 10:30:43 user nova-compute[71283]: DEBUG nova.compute.manager [None req-bc830912-9819-4265-a5f0-79890d07adea tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Start destroying the instance on the hypervisor. {{(pid=71283) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG nova.compute.manager [req-c5b912b3-b9f4-4f83-9c9e-61ac941caa31 req-a9a76739-9e20-4d42-a90b-4775755576ef service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Received event network-vif-unplugged-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d {{(pid=71283) 
external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-c5b912b3-b9f4-4f83-9c9e-61ac941caa31 req-a9a76739-9e20-4d42-a90b-4775755576ef service nova] Acquiring lock "bf7d300f-b748-446b-97ce-6cae12609e7a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-c5b912b3-b9f4-4f83-9c9e-61ac941caa31 req-a9a76739-9e20-4d42-a90b-4775755576ef service nova] Lock "bf7d300f-b748-446b-97ce-6cae12609e7a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-c5b912b3-b9f4-4f83-9c9e-61ac941caa31 req-a9a76739-9e20-4d42-a90b-4775755576ef service nova] Lock "bf7d300f-b748-446b-97ce-6cae12609e7a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG nova.compute.manager [req-c5b912b3-b9f4-4f83-9c9e-61ac941caa31 req-a9a76739-9e20-4d42-a90b-4775755576ef service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] No waiting events found dispatching network-vif-unplugged-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG nova.compute.manager [req-c5b912b3-b9f4-4f83-9c9e-61ac941caa31 req-a9a76739-9e20-4d42-a90b-4775755576ef service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Received event network-vif-unplugged-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d for instance 
with task_state deleting. {{(pid=71283) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:43 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Instance destroyed successfully. Apr 20 10:30:43 user nova-compute[71283]: DEBUG nova.objects.instance [None req-bc830912-9819-4265-a5f0-79890d07adea tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Lazy-loading 'resources' on Instance uuid bf7d300f-b748-446b-97ce-6cae12609e7a {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-bc830912-9819-4265-a5f0-79890d07adea tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2023-04-20T10:28:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-1435500858',display_name='tempest-AttachSCSIVolumeTestJSON-server-1435500858',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-1435500858',id=6,image_ref='9e8cd77e-f94e-4043-a953-9fb6fa201c58',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDbO3gWjR+kfpqealBy4YcF5JnglKkdA7HMnKPq7F8n83ZuToSao0nhpqddYS4a7HYIvtQt1kPTWohrL7u4lMFSRJRaAHMC0ISjFRgdgwi8OOridVoioARVP9cUAseOLWA==',key_name='tempest-keypair-1802426226',keypairs=,launch_index=0,launched_at=2023-04-20T10:28:59Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='cb0f8979fdc4456db83cc88103856510',ramdisk_id='',reservation_id='r-8knp4xne',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e8cd77e-f94e-4043-a953-9fb6fa201c58',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_scsi_model='virtio-scsi',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachSCSIVolumeTestJSON-1542709130',owner_user_name='tempest-AttachS
CSIVolumeTestJSON-1542709130-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T10:28:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6e5685756fe74751ad4c4fe85c30c951',uuid=bf7d300f-b748-446b-97ce-6cae12609e7a,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ffe07547-7e1b-4f9e-afa9-31ccb5dca61d", "address": "fa:16:3e:5d:9f:5d", "network": {"id": "5a79f1a7-4d58-42fc-9c84-6f9951485092", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1428889344-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.26", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0f8979fdc4456db83cc88103856510", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapffe07547-7e", "ovs_interfaceid": "ffe07547-7e1b-4f9e-afa9-31ccb5dca61d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-bc830912-9819-4265-a5f0-79890d07adea tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Converting VIF {"id": "ffe07547-7e1b-4f9e-afa9-31ccb5dca61d", "address": "fa:16:3e:5d:9f:5d", "network": {"id": "5a79f1a7-4d58-42fc-9c84-6f9951485092", "bridge": "br-int", "label": 
"tempest-AttachSCSIVolumeTestJSON-1428889344-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.26", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0f8979fdc4456db83cc88103856510", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapffe07547-7e", "ovs_interfaceid": "ffe07547-7e1b-4f9e-afa9-31ccb5dca61d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-bc830912-9819-4265-a5f0-79890d07adea tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5d:9f:5d,bridge_name='br-int',has_traffic_filtering=True,id=ffe07547-7e1b-4f9e-afa9-31ccb5dca61d,network=Network(5a79f1a7-4d58-42fc-9c84-6f9951485092),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffe07547-7e') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG os_vif [None req-bc830912-9819-4265-a5f0-79890d07adea tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Unplugging vif 
VIFOpenVSwitch(active=True,address=fa:16:3e:5d:9f:5d,bridge_name='br-int',has_traffic_filtering=True,id=ffe07547-7e1b-4f9e-afa9-31ccb5dca61d,network=Network(5a79f1a7-4d58-42fc-9c84-6f9951485092),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffe07547-7e') {{(pid=71283) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapffe07547-7e, bridge=br-int, if_exists=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:43 user nova-compute[71283]: INFO os_vif [None req-bc830912-9819-4265-a5f0-79890d07adea tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5d:9f:5d,bridge_name='br-int',has_traffic_filtering=True,id=ffe07547-7e1b-4f9e-afa9-31ccb5dca61d,network=Network(5a79f1a7-4d58-42fc-9c84-6f9951485092),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapffe07547-7e') Apr 20 10:30:43 user 
nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-bc830912-9819-4265-a5f0-79890d07adea tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Deleting instance files /opt/stack/data/nova/instances/bf7d300f-b748-446b-97ce-6cae12609e7a_del Apr 20 10:30:43 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-bc830912-9819-4265-a5f0-79890d07adea tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Deletion of /opt/stack/data/nova/instances/bf7d300f-b748-446b-97ce-6cae12609e7a_del complete Apr 20 10:30:43 user nova-compute[71283]: INFO nova.compute.manager [None req-bc830912-9819-4265-a5f0-79890d07adea tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Took 0.83 seconds to destroy the instance on the hypervisor. Apr 20 10:30:43 user nova-compute[71283]: DEBUG oslo.service.loopingcall [None req-bc830912-9819-4265-a5f0-79890d07adea tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=71283) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG nova.compute.manager [-] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Deallocating network for instance {{(pid=71283) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 10:30:43 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] deallocate_for_instance() {{(pid=71283) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 10:30:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:44 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:30:44 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Took 0.81 seconds to deallocate network for instance. 
Apr 20 10:30:44 user nova-compute[71283]: DEBUG nova.compute.manager [req-063fc45c-8dfb-4a5d-bf0f-b2bc75fbb1d0 req-c9d9777a-2251-4b05-a7cf-b93166d03056 service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Received event network-vif-deleted-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:30:44 user nova-compute[71283]: INFO nova.compute.manager [req-063fc45c-8dfb-4a5d-bf0f-b2bc75fbb1d0 req-c9d9777a-2251-4b05-a7cf-b93166d03056 service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Neutron deleted interface ffe07547-7e1b-4f9e-afa9-31ccb5dca61d; detaching it from the instance and deleting it from the info cache Apr 20 10:30:44 user nova-compute[71283]: DEBUG nova.network.neutron [req-063fc45c-8dfb-4a5d-bf0f-b2bc75fbb1d0 req-c9d9777a-2251-4b05-a7cf-b93166d03056 service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:30:44 user nova-compute[71283]: DEBUG nova.compute.manager [req-063fc45c-8dfb-4a5d-bf0f-b2bc75fbb1d0 req-c9d9777a-2251-4b05-a7cf-b93166d03056 service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Detach interface failed, port_id=ffe07547-7e1b-4f9e-afa9-31ccb5dca61d, reason: Instance bf7d300f-b748-446b-97ce-6cae12609e7a could not be found. 
{{(pid=71283) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 20 10:30:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-bc830912-9819-4265-a5f0-79890d07adea tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:30:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-bc830912-9819-4265-a5f0-79890d07adea tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:30:44 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-bc830912-9819-4265-a5f0-79890d07adea tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:30:44 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-bc830912-9819-4265-a5f0-79890d07adea tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 
'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:30:45 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-bc830912-9819-4265-a5f0-79890d07adea tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.229s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:30:45 user nova-compute[71283]: INFO nova.scheduler.client.report [None req-bc830912-9819-4265-a5f0-79890d07adea tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Deleted allocations for instance bf7d300f-b748-446b-97ce-6cae12609e7a Apr 20 10:30:45 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-bc830912-9819-4265-a5f0-79890d07adea tempest-AttachSCSIVolumeTestJSON-1542709130 tempest-AttachSCSIVolumeTestJSON-1542709130-project-member] Lock "bf7d300f-b748-446b-97ce-6cae12609e7a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.160s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:30:45 user nova-compute[71283]: DEBUG nova.compute.manager [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Received event network-vif-plugged-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:30:45 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] Acquiring lock "bf7d300f-b748-446b-97ce-6cae12609e7a-events" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:30:45 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] Lock "bf7d300f-b748-446b-97ce-6cae12609e7a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:30:45 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] Lock "bf7d300f-b748-446b-97ce-6cae12609e7a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:30:45 user nova-compute[71283]: DEBUG nova.compute.manager [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] No waiting events found dispatching network-vif-plugged-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:30:45 user nova-compute[71283]: WARNING nova.compute.manager [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Received unexpected event network-vif-plugged-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d for instance with vm_state deleted and task_state None. 
Apr 20 10:30:45 user nova-compute[71283]: DEBUG nova.compute.manager [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Received event network-vif-plugged-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:30:45 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] Acquiring lock "bf7d300f-b748-446b-97ce-6cae12609e7a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:30:45 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] Lock "bf7d300f-b748-446b-97ce-6cae12609e7a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:30:45 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] Lock "bf7d300f-b748-446b-97ce-6cae12609e7a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:30:45 user nova-compute[71283]: DEBUG nova.compute.manager [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] No waiting events found dispatching network-vif-plugged-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 
20 10:30:45 user nova-compute[71283]: WARNING nova.compute.manager [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Received unexpected event network-vif-plugged-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d for instance with vm_state deleted and task_state None. Apr 20 10:30:45 user nova-compute[71283]: DEBUG nova.compute.manager [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Received event network-vif-plugged-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:30:45 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] Acquiring lock "bf7d300f-b748-446b-97ce-6cae12609e7a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:30:45 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] Lock "bf7d300f-b748-446b-97ce-6cae12609e7a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:30:45 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] Lock "bf7d300f-b748-446b-97ce-6cae12609e7a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:30:45 user 
nova-compute[71283]: DEBUG nova.compute.manager [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] No waiting events found dispatching network-vif-plugged-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:30:45 user nova-compute[71283]: WARNING nova.compute.manager [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Received unexpected event network-vif-plugged-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d for instance with vm_state deleted and task_state None. Apr 20 10:30:45 user nova-compute[71283]: DEBUG nova.compute.manager [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Received event network-vif-unplugged-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:30:45 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] Acquiring lock "bf7d300f-b748-446b-97ce-6cae12609e7a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:30:45 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] Lock "bf7d300f-b748-446b-97ce-6cae12609e7a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:30:45 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils 
[req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] Lock "bf7d300f-b748-446b-97ce-6cae12609e7a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:30:45 user nova-compute[71283]: DEBUG nova.compute.manager [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] No waiting events found dispatching network-vif-unplugged-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:30:45 user nova-compute[71283]: WARNING nova.compute.manager [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Received unexpected event network-vif-unplugged-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d for instance with vm_state deleted and task_state None. 
Apr 20 10:30:45 user nova-compute[71283]: DEBUG nova.compute.manager [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Received event network-vif-plugged-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:30:45 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] Acquiring lock "bf7d300f-b748-446b-97ce-6cae12609e7a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:30:45 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] Lock "bf7d300f-b748-446b-97ce-6cae12609e7a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:30:45 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] Lock "bf7d300f-b748-446b-97ce-6cae12609e7a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:30:45 user nova-compute[71283]: DEBUG nova.compute.manager [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] No waiting events found dispatching network-vif-plugged-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 
20 10:30:45 user nova-compute[71283]: WARNING nova.compute.manager [req-aaf67b1a-dd8b-4ebf-8aa3-5e509bf8acce req-f80ccc53-8886-4d73-a0b4-98421e1283a3 service nova] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Received unexpected event network-vif-plugged-ffe07547-7e1b-4f9e-afa9-31ccb5dca61d for instance with vm_state deleted and task_state None. Apr 20 10:30:48 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:53 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:58 user nova-compute[71283]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:30:58 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] VM Stopped (Lifecycle Event) Apr 20 10:30:58 user nova-compute[71283]: DEBUG nova.compute.manager [None req-1b8b737c-0ac2-45b7-b437-ffed1c28a19b None None] [instance: bf7d300f-b748-446b-97ce-6cae12609e7a] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:30:58 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:30:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:02 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-c0bd887b-7684-4f2a-b350-11faf6884902 tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Acquiring lock "23940921-698b-4e96-8aed-2d1c8c12299f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:31:02 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-c0bd887b-7684-4f2a-b350-11faf6884902 tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Lock "23940921-698b-4e96-8aed-2d1c8c12299f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:31:02 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-c0bd887b-7684-4f2a-b350-11faf6884902 tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Acquiring lock "23940921-698b-4e96-8aed-2d1c8c12299f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:31:02 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-c0bd887b-7684-4f2a-b350-11faf6884902 tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Lock "23940921-698b-4e96-8aed-2d1c8c12299f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:31:02 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None 
req-c0bd887b-7684-4f2a-b350-11faf6884902 tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Lock "23940921-698b-4e96-8aed-2d1c8c12299f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:31:02 user nova-compute[71283]: INFO nova.compute.manager [None req-c0bd887b-7684-4f2a-b350-11faf6884902 tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Terminating instance Apr 20 10:31:02 user nova-compute[71283]: DEBUG nova.compute.manager [None req-c0bd887b-7684-4f2a-b350-11faf6884902 tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Start destroying the instance on the hypervisor. {{(pid=71283) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 10:31:02 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:02 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:02 user nova-compute[71283]: DEBUG nova.compute.manager [req-2c5b8805-b211-462e-bb9a-0d82a911f93b req-ddc6a37e-3bc4-40f6-8134-6ed7f4f94cf3 service nova] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Received event network-vif-unplugged-9c8c1513-213f-430e-8807-4172515e6170 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:31:02 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-2c5b8805-b211-462e-bb9a-0d82a911f93b req-ddc6a37e-3bc4-40f6-8134-6ed7f4f94cf3 service nova] Acquiring lock 
"23940921-698b-4e96-8aed-2d1c8c12299f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:31:02 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-2c5b8805-b211-462e-bb9a-0d82a911f93b req-ddc6a37e-3bc4-40f6-8134-6ed7f4f94cf3 service nova] Lock "23940921-698b-4e96-8aed-2d1c8c12299f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:31:02 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-2c5b8805-b211-462e-bb9a-0d82a911f93b req-ddc6a37e-3bc4-40f6-8134-6ed7f4f94cf3 service nova] Lock "23940921-698b-4e96-8aed-2d1c8c12299f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:31:02 user nova-compute[71283]: DEBUG nova.compute.manager [req-2c5b8805-b211-462e-bb9a-0d82a911f93b req-ddc6a37e-3bc4-40f6-8134-6ed7f4f94cf3 service nova] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] No waiting events found dispatching network-vif-unplugged-9c8c1513-213f-430e-8807-4172515e6170 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:31:02 user nova-compute[71283]: DEBUG nova.compute.manager [req-2c5b8805-b211-462e-bb9a-0d82a911f93b req-ddc6a37e-3bc4-40f6-8134-6ed7f4f94cf3 service nova] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Received event network-vif-unplugged-9c8c1513-213f-430e-8807-4172515e6170 for instance with task_state deleting. 
{{(pid=71283) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 10:31:02 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Instance destroyed successfully. Apr 20 10:31:02 user nova-compute[71283]: DEBUG nova.objects.instance [None req-c0bd887b-7684-4f2a-b350-11faf6884902 tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Lazy-loading 'resources' on Instance uuid 23940921-698b-4e96-8aed-2d1c8c12299f {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:31:02 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-c0bd887b-7684-4f2a-b350-11faf6884902 tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:29:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-304888933',display_name='tempest-VolumesActionsTest-instance-304888933',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-304888933',id=7,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2023-04-20T10:29:17Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3d65a1127ac04ac8976d9fbb08197248',ramdisk_id='',reservation_id='r-o86xj3x6',resources=None,root_device_name='/dev/vda',root_gb=1,secu
rity_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesActionsTest-640305762',owner_user_name='tempest-VolumesActionsTest-640305762-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2023-04-20T10:29:18Z,user_data=None,user_id='8355ed1609024bc1a30cbc422d8f90c2',uuid=23940921-698b-4e96-8aed-2d1c8c12299f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9c8c1513-213f-430e-8807-4172515e6170", "address": "fa:16:3e:82:4d:f7", "network": {"id": "324b6a85-3443-483c-bd8e-4a128c9daf02", "bridge": "br-int", "label": "tempest-VolumesActionsTest-627525662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "3d65a1127ac04ac8976d9fbb08197248", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c8c1513-21", "ovs_interfaceid": "9c8c1513-213f-430e-8807-4172515e6170", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) 
unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 10:31:02 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-c0bd887b-7684-4f2a-b350-11faf6884902 tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Converting VIF {"id": "9c8c1513-213f-430e-8807-4172515e6170", "address": "fa:16:3e:82:4d:f7", "network": {"id": "324b6a85-3443-483c-bd8e-4a128c9daf02", "bridge": "br-int", "label": "tempest-VolumesActionsTest-627525662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "3d65a1127ac04ac8976d9fbb08197248", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c8c1513-21", "ovs_interfaceid": "9c8c1513-213f-430e-8807-4172515e6170", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:31:02 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-c0bd887b-7684-4f2a-b350-11faf6884902 tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:4d:f7,bridge_name='br-int',has_traffic_filtering=True,id=9c8c1513-213f-430e-8807-4172515e6170,network=Network(324b6a85-3443-483c-bd8e-4a128c9daf02),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c8c1513-21') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:31:02 user nova-compute[71283]: DEBUG os_vif [None 
req-c0bd887b-7684-4f2a-b350-11faf6884902 tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:4d:f7,bridge_name='br-int',has_traffic_filtering=True,id=9c8c1513-213f-430e-8807-4172515e6170,network=Network(324b6a85-3443-483c-bd8e-4a128c9daf02),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c8c1513-21') {{(pid=71283) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 10:31:02 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:02 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c8c1513-21, bridge=br-int, if_exists=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:31:02 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:02 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:31:02 user nova-compute[71283]: INFO os_vif [None req-c0bd887b-7684-4f2a-b350-11faf6884902 tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:4d:f7,bridge_name='br-int',has_traffic_filtering=True,id=9c8c1513-213f-430e-8807-4172515e6170,network=Network(324b6a85-3443-483c-bd8e-4a128c9daf02),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c8c1513-21') Apr 20 10:31:02 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None 
req-c0bd887b-7684-4f2a-b350-11faf6884902 tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Deleting instance files /opt/stack/data/nova/instances/23940921-698b-4e96-8aed-2d1c8c12299f_del Apr 20 10:31:02 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-c0bd887b-7684-4f2a-b350-11faf6884902 tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Deletion of /opt/stack/data/nova/instances/23940921-698b-4e96-8aed-2d1c8c12299f_del complete Apr 20 10:31:03 user nova-compute[71283]: INFO nova.compute.manager [None req-c0bd887b-7684-4f2a-b350-11faf6884902 tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Took 0.63 seconds to destroy the instance on the hypervisor. Apr 20 10:31:03 user nova-compute[71283]: DEBUG oslo.service.loopingcall [None req-c0bd887b-7684-4f2a-b350-11faf6884902 tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=71283) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 10:31:03 user nova-compute[71283]: DEBUG nova.compute.manager [-] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Deallocating network for instance {{(pid=71283) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 10:31:03 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] deallocate_for_instance() {{(pid=71283) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 10:31:03 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:31:03 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Took 0.51 seconds to deallocate network for instance. 
Apr 20 10:31:03 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-c0bd887b-7684-4f2a-b350-11faf6884902 tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:31:03 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-c0bd887b-7684-4f2a-b350-11faf6884902 tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:31:03 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-c0bd887b-7684-4f2a-b350-11faf6884902 tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:31:03 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-c0bd887b-7684-4f2a-b350-11faf6884902 tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:31:03 user 
nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-c0bd887b-7684-4f2a-b350-11faf6884902 tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.218s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:31:03 user nova-compute[71283]: INFO nova.scheduler.client.report [None req-c0bd887b-7684-4f2a-b350-11faf6884902 tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Deleted allocations for instance 23940921-698b-4e96-8aed-2d1c8c12299f Apr 20 10:31:03 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-c0bd887b-7684-4f2a-b350-11faf6884902 tempest-VolumesActionsTest-640305762 tempest-VolumesActionsTest-640305762-project-member] Lock "23940921-698b-4e96-8aed-2d1c8c12299f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.538s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:31:04 user nova-compute[71283]: DEBUG nova.compute.manager [req-782bbe41-a2b5-42d7-8dba-a79c34ef1239 req-7794c99f-094a-49ce-9a3d-dfe186ed11b1 service nova] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Received event network-vif-plugged-9c8c1513-213f-430e-8807-4172515e6170 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:31:04 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-782bbe41-a2b5-42d7-8dba-a79c34ef1239 req-7794c99f-094a-49ce-9a3d-dfe186ed11b1 service nova] Acquiring lock "23940921-698b-4e96-8aed-2d1c8c12299f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:31:04 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils 
[req-782bbe41-a2b5-42d7-8dba-a79c34ef1239 req-7794c99f-094a-49ce-9a3d-dfe186ed11b1 service nova] Lock "23940921-698b-4e96-8aed-2d1c8c12299f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:31:04 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-782bbe41-a2b5-42d7-8dba-a79c34ef1239 req-7794c99f-094a-49ce-9a3d-dfe186ed11b1 service nova] Lock "23940921-698b-4e96-8aed-2d1c8c12299f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:31:04 user nova-compute[71283]: DEBUG nova.compute.manager [req-782bbe41-a2b5-42d7-8dba-a79c34ef1239 req-7794c99f-094a-49ce-9a3d-dfe186ed11b1 service nova] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] No waiting events found dispatching network-vif-plugged-9c8c1513-213f-430e-8807-4172515e6170 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:31:04 user nova-compute[71283]: WARNING nova.compute.manager [req-782bbe41-a2b5-42d7-8dba-a79c34ef1239 req-7794c99f-094a-49ce-9a3d-dfe186ed11b1 service nova] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Received unexpected event network-vif-plugged-9c8c1513-213f-430e-8807-4172515e6170 for instance with vm_state deleted and task_state None. 
Apr 20 10:31:04 user nova-compute[71283]: DEBUG nova.compute.manager [req-782bbe41-a2b5-42d7-8dba-a79c34ef1239 req-7794c99f-094a-49ce-9a3d-dfe186ed11b1 service nova] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Received event network-vif-deleted-9c8c1513-213f-430e-8807-4172515e6170 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:31:06 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:07 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:12 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:12 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:31:12 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:31:12 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:31:12 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:14 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:31:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:31:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:31:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 
20 10:31:14 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:31:14 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f48e6aa1-dd33-42a4-89c9-20691b628c70/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:31:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f48e6aa1-dd33-42a4-89c9-20691b628c70/disk --force-share --output=json" returned: 0 in 0.159s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:31:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f48e6aa1-dd33-42a4-89c9-20691b628c70/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:31:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f48e6aa1-dd33-42a4-89c9-20691b628c70/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:31:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/db38a0ab-0165-480d-bddb-cab50bcd22e4/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:31:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/db38a0ab-0165-480d-bddb-cab50bcd22e4/disk --force-share --output=json" returned: 0 in 0.152s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:31:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/db38a0ab-0165-480d-bddb-cab50bcd22e4/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:31:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/db38a0ab-0165-480d-bddb-cab50bcd22e4/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:31:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:31:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json" returned: 0 in 0.127s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:31:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:31:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:31:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9189f862-2e91-4420-ab64-54375c4f9466/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:31:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9189f862-2e91-4420-ab64-54375c4f9466/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:31:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9189f862-2e91-4420-ab64-54375c4f9466/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:31:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9189f862-2e91-4420-ab64-54375c4f9466/disk --force-share 
--output=json" returned: 0 in 0.132s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:31:16 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:31:16 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:31:16 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=8739MB free_disk=26.53630828857422GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", 
"vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, 
{"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": 
"15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 10:31:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:31:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:31:16 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance f48e6aa1-dd33-42a4-89c9-20691b628c70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:31:16 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 64fb0d55-ef35-4386-86fe-00775b83a8d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:31:16 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 9189f862-2e91-4420-ab64-54375c4f9466 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:31:16 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance db38a0ab-0165-480d-bddb-cab50bcd22e4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:31:16 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 4 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:31:16 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=1024MB phys_disk=40GB used_disk=4GB total_vcpus=12 used_vcpus=4 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:31:16 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:31:16 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:31:16 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:31:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.286s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:31:17 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:31:17 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:31:17 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "refresh_cache-64fb0d55-ef35-4386-86fe-00775b83a8d4" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:31:17 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquired lock "refresh_cache-64fb0d55-ef35-4386-86fe-00775b83a8d4" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:31:17 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Forcefully refreshing network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 10:31:17 user nova-compute[71283]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71283) emit_event 
/opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:31:17 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] VM Stopped (Lifecycle Event) Apr 20 10:31:17 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Acquiring lock "afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:31:17 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:31:17 user nova-compute[71283]: DEBUG nova.compute.manager [None req-f567584a-de1a-4e54-a9cb-872e5c4682d6 None None] [instance: 23940921-698b-4e96-8aed-2d1c8c12299f] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:31:17 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:17 user nova-compute[71283]: DEBUG nova.compute.manager [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Starting instance... 
{{(pid=71283) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 10:31:18 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:31:18 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:31:18 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71283) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 10:31:18 user nova-compute[71283]: INFO nova.compute.claims [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Claim successful on node user Apr 20 10:31:18 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:31:18 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:31:18 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.308s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:31:18 user 
nova-compute[71283]: DEBUG nova.compute.manager [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Start building networks asynchronously for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 10:31:18 user nova-compute[71283]: DEBUG nova.compute.manager [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Allocating IP information in the background. {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 10:31:18 user nova-compute[71283]: DEBUG nova.network.neutron [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] allocate_for_instance() {{(pid=71283) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 10:31:18 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 10:31:18 user nova-compute[71283]: DEBUG nova.compute.manager [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Start building block device mappings for instance. 
{{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 10:31:18 user nova-compute[71283]: DEBUG nova.policy [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '67955d2e81c04b8d8dbcbe577303e025', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1b4a2af680394ec889b4661753658b01', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71283) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 10:31:18 user nova-compute[71283]: DEBUG nova.compute.manager [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Start spawning the instance on the hypervisor. 
{{(pid=71283) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 10:31:18 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Creating instance directory {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 10:31:18 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Creating image(s) Apr 20 10:31:18 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Acquiring lock "/opt/stack/data/nova/instances/afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:31:18 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "/opt/stack/data/nova/instances/afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:31:18 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock 
"/opt/stack/data/nova/instances/afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:31:18 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:31:18 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Updating instance_info_cache with network_info: [{"id": "851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1", "address": "fa:16:3e:fe:b8:27", "network": {"id": "e68379ef-0781-4af6-be68-a311cdded61e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-509902278-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8d9c5dd66fd0496a8e92b41d98fe1727", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap851e6a9d-9d", "ovs_interfaceid": "851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1", 
"qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:31:18 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Releasing lock "refresh_cache-64fb0d55-ef35-4386-86fe-00775b83a8d4" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:31:18 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Updated the network info_cache for instance {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 10:31:18 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:31:18 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:31:18 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:31:18 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:31:18 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:31:18 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.138s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:31:18 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Acquiring lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:31:18 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:31:18 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 
tempest-AttachVolumeTestJSON-1715839687-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:31:18 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.142s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:31:18 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af/disk 1073741824 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:31:19 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw 
/opt/stack/data/nova/instances/afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af/disk 1073741824" returned: 0 in 0.049s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:31:19 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.196s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:31:19 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:31:19 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.138s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:31:19 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 
tempest-AttachVolumeTestJSON-1715839687-project-member] Checking if we can resize image /opt/stack/data/nova/instances/afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af/disk. size=1073741824 {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 10:31:19 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:31:19 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:31:19 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Cannot resize image /opt/stack/data/nova/instances/afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af/disk to a smaller size. 
{{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 10:31:19 user nova-compute[71283]: DEBUG nova.objects.instance [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lazy-loading 'migration_context' on Instance uuid afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:31:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:19 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Created local disks {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 10:31:19 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Ensure instance console log exists: /opt/stack/data/nova/instances/afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af/console.log {{(pid=71283) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 10:31:19 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:31:19 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None 
req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:31:19 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:31:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:20 user nova-compute[71283]: DEBUG nova.network.neutron [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Successfully created port: 3a9efede-078b-4adc-9b72-465253f4444d {{(pid=71283) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 10:31:21 user nova-compute[71283]: DEBUG nova.network.neutron [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Successfully updated port: 3a9efede-078b-4adc-9b72-465253f4444d {{(pid=71283) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 10:31:21 user nova-compute[71283]: DEBUG nova.compute.manager [req-a6d43192-e111-4588-997d-505a22c9d7d4 req-03aa7ae6-850f-4707-b8cf-66d042bb8d2f service nova] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Received event 
network-changed-3a9efede-078b-4adc-9b72-465253f4444d {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:31:21 user nova-compute[71283]: DEBUG nova.compute.manager [req-a6d43192-e111-4588-997d-505a22c9d7d4 req-03aa7ae6-850f-4707-b8cf-66d042bb8d2f service nova] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Refreshing instance network info cache due to event network-changed-3a9efede-078b-4adc-9b72-465253f4444d. {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:31:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-a6d43192-e111-4588-997d-505a22c9d7d4 req-03aa7ae6-850f-4707-b8cf-66d042bb8d2f service nova] Acquiring lock "refresh_cache-afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:31:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-a6d43192-e111-4588-997d-505a22c9d7d4 req-03aa7ae6-850f-4707-b8cf-66d042bb8d2f service nova] Acquired lock "refresh_cache-afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:31:21 user nova-compute[71283]: DEBUG nova.network.neutron [req-a6d43192-e111-4588-997d-505a22c9d7d4 req-03aa7ae6-850f-4707-b8cf-66d042bb8d2f service nova] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Refreshing network info cache for port 3a9efede-078b-4adc-9b72-465253f4444d {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:31:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Acquiring lock "refresh_cache-afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:31:21 user 
nova-compute[71283]: DEBUG nova.network.neutron [req-a6d43192-e111-4588-997d-505a22c9d7d4 req-03aa7ae6-850f-4707-b8cf-66d042bb8d2f service nova] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Instance cache missing network info. {{(pid=71283) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG nova.network.neutron [req-a6d43192-e111-4588-997d-505a22c9d7d4 req-03aa7ae6-850f-4707-b8cf-66d042bb8d2f service nova] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-a6d43192-e111-4588-997d-505a22c9d7d4 req-03aa7ae6-850f-4707-b8cf-66d042bb8d2f service nova] Releasing lock "refresh_cache-afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Acquired lock "refresh_cache-afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG nova.network.neutron [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Building network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG nova.network.neutron [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] 
[instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Instance cache missing network info. {{(pid=71283) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG nova.network.neutron [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Updating instance_info_cache with network_info: [{"id": "3a9efede-078b-4adc-9b72-465253f4444d", "address": "fa:16:3e:9b:f8:4f", "network": {"id": "a64312c6-b1a8-450f-9e8d-0cf0282aef90", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-329219891-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1b4a2af680394ec889b4661753658b01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a9efede-07", "ovs_interfaceid": "3a9efede-078b-4adc-9b72-465253f4444d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Releasing lock "refresh_cache-afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:31:22 user nova-compute[71283]: 
DEBUG nova.compute.manager [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Instance network_info: |[{"id": "3a9efede-078b-4adc-9b72-465253f4444d", "address": "fa:16:3e:9b:f8:4f", "network": {"id": "a64312c6-b1a8-450f-9e8d-0cf0282aef90", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-329219891-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1b4a2af680394ec889b4661753658b01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a9efede-07", "ovs_interfaceid": "3a9efede-078b-4adc-9b72-465253f4444d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Start _get_guest_xml network_info=[{"id": "3a9efede-078b-4adc-9b72-465253f4444d", "address": "fa:16:3e:9b:f8:4f", "network": {"id": "a64312c6-b1a8-450f-9e8d-0cf0282aef90", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-329219891-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1b4a2af680394ec889b4661753658b01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a9efede-07", "ovs_interfaceid": "3a9efede-078b-4adc-9b72-465253f4444d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '3687b278-c2bc-46f2-9b7e-579c8d06fe41'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 10:31:22 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] This host 
appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:31:22 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:31:22 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71283) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T10:25:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=), allow threads: True {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG nova.virt.hardware [None 
req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Flavor limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Image limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Flavor pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Image pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71283) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Got 1 possible topologies {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:31:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1492863401',display_name='tempest-AttachVolumeTestJSON-server-1492863401',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-1492863401',id=9,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCJ5hWqMeQoiTjX1acBo0XOFkMSjsm4q8qCy1ebdaywrtaD1l+xT9hnnoXuRjhZSD6F7eyAs+ruYJLSMyK+7gGsfIukb8brNhWiGX5TJye8A1hd9hjeQcPgsM9lbt6bNyw==',key_name='tempest-keypair-262941358',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1b4a2af680394ec889b4661753658b01',ramdisk_id='',reservation_id='r-z59spi1v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-1715839687',owner_user_na
me='tempest-AttachVolumeTestJSON-1715839687-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:31:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='67955d2e81c04b8d8dbcbe577303e025',uuid=afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3a9efede-078b-4adc-9b72-465253f4444d", "address": "fa:16:3e:9b:f8:4f", "network": {"id": "a64312c6-b1a8-450f-9e8d-0cf0282aef90", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-329219891-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1b4a2af680394ec889b4661753658b01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a9efede-07", "ovs_interfaceid": "3a9efede-078b-4adc-9b72-465253f4444d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71283) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Converting VIF {"id": "3a9efede-078b-4adc-9b72-465253f4444d", "address": "fa:16:3e:9b:f8:4f", "network": {"id": "a64312c6-b1a8-450f-9e8d-0cf0282aef90", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-329219891-network", 
"subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1b4a2af680394ec889b4661753658b01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a9efede-07", "ovs_interfaceid": "3a9efede-078b-4adc-9b72-465253f4444d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:f8:4f,bridge_name='br-int',has_traffic_filtering=True,id=3a9efede-078b-4adc-9b72-465253f4444d,network=Network(a64312c6-b1a8-450f-9e8d-0cf0282aef90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a9efede-07') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG nova.objects.instance [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lazy-loading 'pci_devices' on Instance uuid afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c 
tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] End _get_guest_xml xml= [guest domain XML omitted: the angle-bracket markup was stripped during log capture, leaving only bare element values; surviving fields include uuid afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af, name instance-00000009, memory 131072, 1 vCPU, creation time 2023-04-20 10:31:22, owner tempest-AttachVolumeTestJSON-1715839687-project-member / tempest-AttachVolumeTestJSON-1715839687, sysinfo manufacturer OpenStack Foundation, product OpenStack Nova, version 0.0.0, serial/uuid afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af, family Virtual Machine, os type hvm, CPU model Nehalem, rng backend /dev/urandom] {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:31:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1492863401',display_name='tempest-AttachVolumeTestJSON-server-1492863401',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-1492863401',id=9,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCJ5hWqMeQoiTjX1acBo0XOFkMSjsm4q8qCy1ebdaywrtaD1l+xT9hnnoXuRjhZSD6F7eyAs+ruYJLSMyK+7gGsfIukb8brNhWiGX5TJye8A1hd9hjeQcPgsM9lbt6bNyw==',key_name='tempest-keypair-262941358',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1b4a2af680394ec889b4661753658b01',ramdisk_id='',reservation_id='r-z59spi1v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-1715839687',owner_user_name='tempest-AttachVolumeTestJSON-1715839687-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:31:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='67955d2e81c04b8d8dbcbe577303e025',uuid=afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3a9efede-078b-4adc-9b72-465253f4444d", "address": "fa:16:3e:9b:f8:4f", "network": {"id": "a64312c6-b1a8-450f-9e8d-0cf0282aef90", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-329219891-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": 
"10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1b4a2af680394ec889b4661753658b01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a9efede-07", "ovs_interfaceid": "3a9efede-078b-4adc-9b72-465253f4444d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Converting VIF {"id": "3a9efede-078b-4adc-9b72-465253f4444d", "address": "fa:16:3e:9b:f8:4f", "network": {"id": "a64312c6-b1a8-450f-9e8d-0cf0282aef90", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-329219891-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1b4a2af680394ec889b4661753658b01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a9efede-07", "ovs_interfaceid": "3a9efede-078b-4adc-9b72-465253f4444d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": 
{}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:f8:4f,bridge_name='br-int',has_traffic_filtering=True,id=3a9efede-078b-4adc-9b72-465253f4444d,network=Network(a64312c6-b1a8-450f-9e8d-0cf0282aef90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a9efede-07') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG os_vif [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:f8:4f,bridge_name='br-int',has_traffic_filtering=True,id=3a9efede-078b-4adc-9b72-465253f4444d,network=Network(a64312c6-b1a8-450f-9e8d-0cf0282aef90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a9efede-07') {{(pid=71283) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71283) do_commit 
/usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3a9efede-07, may_exist=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3a9efede-07, col_values=(('external_ids', {'iface-id': '3a9efede-078b-4adc-9b72-465253f4444d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9b:f8:4f', 'vm-uuid': 'afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af'}),)) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:22 user nova-compute[71283]: INFO os_vif [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Successfully plugged vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:9b:f8:4f,bridge_name='br-int',has_traffic_filtering=True,id=3a9efede-078b-4adc-9b72-465253f4444d,network=Network(a64312c6-b1a8-450f-9e8d-0cf0282aef90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a9efede-07') Apr 20 10:31:22 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] No BDM found with device name vda, not building metadata. {{(pid=71283) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 10:31:22 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] No VIF found with MAC fa:16:3e:9b:f8:4f, not building metadata {{(pid=71283) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 10:31:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:24 user nova-compute[71283]: DEBUG nova.compute.manager [req-40d3c7e7-3a61-43d1-a242-4b388123fe42 req-19278890-779f-40b1-a395-709831a751ff service nova] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Received event network-vif-plugged-3a9efede-078b-4adc-9b72-465253f4444d {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:31:24 user 
nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-40d3c7e7-3a61-43d1-a242-4b388123fe42 req-19278890-779f-40b1-a395-709831a751ff service nova] Acquiring lock "afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:31:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-40d3c7e7-3a61-43d1-a242-4b388123fe42 req-19278890-779f-40b1-a395-709831a751ff service nova] Lock "afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:31:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-40d3c7e7-3a61-43d1-a242-4b388123fe42 req-19278890-779f-40b1-a395-709831a751ff service nova] Lock "afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:31:24 user nova-compute[71283]: DEBUG nova.compute.manager [req-40d3c7e7-3a61-43d1-a242-4b388123fe42 req-19278890-779f-40b1-a395-709831a751ff service nova] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] No waiting events found dispatching network-vif-plugged-3a9efede-078b-4adc-9b72-465253f4444d {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:31:24 user nova-compute[71283]: WARNING nova.compute.manager [req-40d3c7e7-3a61-43d1-a242-4b388123fe42 req-19278890-779f-40b1-a395-709831a751ff service nova] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Received unexpected event network-vif-plugged-3a9efede-078b-4adc-9b72-465253f4444d for instance with vm_state building and task_state spawning. 
Apr 20 10:31:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:26 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event <LifecycleEvent: ..., afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af => Resumed> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:31:26 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] VM Resumed (Lifecycle Event) Apr 20 10:31:26 user nova-compute[71283]: DEBUG nova.compute.manager [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Instance event wait completed in 0 seconds for {{(pid=71283) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 10:31:26 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: 
afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Guest created on hypervisor {{(pid=71283) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 10:31:26 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Instance spawned successfully. Apr 20 10:31:26 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 10:31:26 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:31:26 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:31:26 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 20 10:31:26 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event <LifecycleEvent: ..., afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af => Started> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:31:26 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] VM Started (Lifecycle Event) Apr 20 10:31:26 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Found default for hw_cdrom_bus of ide {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:31:26 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Found default for hw_disk_bus of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:31:26 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Found default for hw_input_bus of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:31:26 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Found default for hw_pointer_model of None {{(pid=71283) _register_undefined_instance_details 
/opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:31:26 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Found default for hw_video_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:31:26 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Found default for hw_vif_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:31:26 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:31:26 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:31:26 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 20 10:31:26 user nova-compute[71283]: INFO nova.compute.manager [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Took 7.67 seconds to spawn the instance on the hypervisor. Apr 20 10:31:26 user nova-compute[71283]: DEBUG nova.compute.manager [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:31:26 user nova-compute[71283]: DEBUG nova.compute.manager [req-c4743f78-b09d-4d5c-9786-72528946711f req-cb8c8649-f44a-48ad-bb2f-53a210b2f1ba service nova] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Received event network-vif-plugged-3a9efede-078b-4adc-9b72-465253f4444d {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:31:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-c4743f78-b09d-4d5c-9786-72528946711f req-cb8c8649-f44a-48ad-bb2f-53a210b2f1ba service nova] Acquiring lock "afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:31:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-c4743f78-b09d-4d5c-9786-72528946711f req-cb8c8649-f44a-48ad-bb2f-53a210b2f1ba service nova] Lock "afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:31:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-c4743f78-b09d-4d5c-9786-72528946711f 
req-cb8c8649-f44a-48ad-bb2f-53a210b2f1ba service nova] Lock "afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:31:26 user nova-compute[71283]: DEBUG nova.compute.manager [req-c4743f78-b09d-4d5c-9786-72528946711f req-cb8c8649-f44a-48ad-bb2f-53a210b2f1ba service nova] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] No waiting events found dispatching network-vif-plugged-3a9efede-078b-4adc-9b72-465253f4444d {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:31:26 user nova-compute[71283]: WARNING nova.compute.manager [req-c4743f78-b09d-4d5c-9786-72528946711f req-cb8c8649-f44a-48ad-bb2f-53a210b2f1ba service nova] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Received unexpected event network-vif-plugged-3a9efede-078b-4adc-9b72-465253f4444d for instance with vm_state building and task_state spawning. Apr 20 10:31:26 user nova-compute[71283]: INFO nova.compute.manager [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Took 8.40 seconds to build instance. 
Apr 20 10:31:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-aa17d251-e31e-465b-88bb-b227c0b8ca6c tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.518s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:31:26 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:27 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:32 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:37 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:37 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:38 user 
nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:39 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:40 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:41 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquiring lock "319eafd3-e6ac-4fcc-92b1-a5f4e60952e8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:31:41 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "319eafd3-e6ac-4fcc-92b1-a5f4e60952e8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:31:41 user nova-compute[71283]: DEBUG nova.compute.manager [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Starting instance... 
{{(pid=71283) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 10:31:41 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:31:41 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:31:41 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71283) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 10:31:41 user nova-compute[71283]: INFO nova.compute.claims [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Claim successful on node user Apr 20 10:31:41 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:31:41 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:31:42 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.368s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 
20 10:31:42 user nova-compute[71283]: DEBUG nova.compute.manager [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Start building networks asynchronously for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 10:31:42 user nova-compute[71283]: DEBUG nova.compute.manager [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Allocating IP information in the background. {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 10:31:42 user nova-compute[71283]: DEBUG nova.network.neutron [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] allocate_for_instance() {{(pid=71283) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 10:31:42 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 10:31:42 user nova-compute[71283]: DEBUG nova.compute.manager [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Start building block device mappings for instance. 
{{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 10:31:42 user nova-compute[71283]: DEBUG nova.policy [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2a04be58cd354d508616edd9d5eeff54', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '12e621999b00481c839affc4e83ce37c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71283) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 10:31:42 user nova-compute[71283]: DEBUG nova.compute.manager [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Start spawning the instance on the hypervisor. 
{{(pid=71283) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 10:31:42 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Creating instance directory {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 10:31:42 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Creating image(s) Apr 20 10:31:42 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquiring lock "/opt/stack/data/nova/instances/319eafd3-e6ac-4fcc-92b1-a5f4e60952e8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:31:42 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "/opt/stack/data/nova/instances/319eafd3-e6ac-4fcc-92b1-a5f4e60952e8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:31:42 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 
tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "/opt/stack/data/nova/instances/319eafd3-e6ac-4fcc-92b1-a5f4e60952e8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:31:42 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:31:42 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.163s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:31:42 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquiring lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:31:42 user nova-compute[71283]: 
DEBUG oslo_concurrency.lockutils [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:31:42 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:31:42 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.142s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:31:42 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw 
/opt/stack/data/nova/instances/319eafd3-e6ac-4fcc-92b1-a5f4e60952e8/disk 1073741824 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:31:42 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:42 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/319eafd3-e6ac-4fcc-92b1-a5f4e60952e8/disk 1073741824" returned: 0 in 0.049s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:31:42 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.195s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:31:42 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:31:42 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.143s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:31:42 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Checking if we can resize image /opt/stack/data/nova/instances/319eafd3-e6ac-4fcc-92b1-a5f4e60952e8/disk. size=1073741824 {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 10:31:42 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/319eafd3-e6ac-4fcc-92b1-a5f4e60952e8/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:31:42 user nova-compute[71283]: DEBUG nova.network.neutron [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Successfully created port: f6cef929-1703-4207-bef4-18cfa6e9f6fc {{(pid=71283) 
_create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 10:31:43 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/319eafd3-e6ac-4fcc-92b1-a5f4e60952e8/disk --force-share --output=json" returned: 0 in 0.214s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:31:43 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Cannot resize image /opt/stack/data/nova/instances/319eafd3-e6ac-4fcc-92b1-a5f4e60952e8/disk to a smaller size. {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 10:31:43 user nova-compute[71283]: DEBUG nova.objects.instance [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lazy-loading 'migration_context' on Instance uuid 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:31:43 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Created local disks {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 10:31:43 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 
tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Ensure instance console log exists: /opt/stack/data/nova/instances/319eafd3-e6ac-4fcc-92b1-a5f4e60952e8/console.log {{(pid=71283) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 10:31:43 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:31:43 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:31:43 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:31:43 user nova-compute[71283]: DEBUG nova.network.neutron [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Successfully updated port: f6cef929-1703-4207-bef4-18cfa6e9f6fc {{(pid=71283) _update_port 
/opt/stack/nova/nova/network/neutron.py:584}} Apr 20 10:31:43 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquiring lock "refresh_cache-319eafd3-e6ac-4fcc-92b1-a5f4e60952e8" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:31:43 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquired lock "refresh_cache-319eafd3-e6ac-4fcc-92b1-a5f4e60952e8" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:31:43 user nova-compute[71283]: DEBUG nova.network.neutron [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Building network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 10:31:43 user nova-compute[71283]: DEBUG nova.compute.manager [req-03c01caa-cdba-4bc2-a2f9-a2635a756776 req-592034d8-11f7-4a49-8e5d-3f512612c772 service nova] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Received event network-changed-f6cef929-1703-4207-bef4-18cfa6e9f6fc {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:31:43 user nova-compute[71283]: DEBUG nova.compute.manager [req-03c01caa-cdba-4bc2-a2f9-a2635a756776 req-592034d8-11f7-4a49-8e5d-3f512612c772 service nova] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Refreshing instance network info cache due to event network-changed-f6cef929-1703-4207-bef4-18cfa6e9f6fc. 
{{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:31:43 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-03c01caa-cdba-4bc2-a2f9-a2635a756776 req-592034d8-11f7-4a49-8e5d-3f512612c772 service nova] Acquiring lock "refresh_cache-319eafd3-e6ac-4fcc-92b1-a5f4e60952e8" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:31:43 user nova-compute[71283]: DEBUG nova.network.neutron [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Instance cache missing network info. {{(pid=71283) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG nova.network.neutron [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Updating instance_info_cache with network_info: [{"id": "f6cef929-1703-4207-bef4-18cfa6e9f6fc", "address": "fa:16:3e:cd:ca:12", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6cef929-17", "ovs_interfaceid": "f6cef929-1703-4207-bef4-18cfa6e9f6fc", "qbh_params": null, 
"qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Releasing lock "refresh_cache-319eafd3-e6ac-4fcc-92b1-a5f4e60952e8" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG nova.compute.manager [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Instance network_info: |[{"id": "f6cef929-1703-4207-bef4-18cfa6e9f6fc", "address": "fa:16:3e:cd:ca:12", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6cef929-17", "ovs_interfaceid": "f6cef929-1703-4207-bef4-18cfa6e9f6fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71283) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-03c01caa-cdba-4bc2-a2f9-a2635a756776 req-592034d8-11f7-4a49-8e5d-3f512612c772 service nova] Acquired lock "refresh_cache-319eafd3-e6ac-4fcc-92b1-a5f4e60952e8" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG nova.network.neutron [req-03c01caa-cdba-4bc2-a2f9-a2635a756776 req-592034d8-11f7-4a49-8e5d-3f512612c772 service nova] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Refreshing network info cache for port f6cef929-1703-4207-bef4-18cfa6e9f6fc {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Start _get_guest_xml network_info=[{"id": "f6cef929-1703-4207-bef4-18cfa6e9f6fc", "address": "fa:16:3e:cd:ca:12", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6cef929-17", "ovs_interfaceid": "f6cef929-1703-4207-bef4-18cfa6e9f6fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '3687b278-c2bc-46f2-9b7e-579c8d06fe41'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 10:31:44 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:31:44 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:31:44 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71283) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T10:25:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=), allow threads: True {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Flavor limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 
tempest-AttachVolumeNegativeTest-1619866573-project-member] Image limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Flavor pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Image pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 
10:31:44 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Got 1 possible topologies {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:31:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1210707222',display_name='tempest-AttachVolumeNegativeTest-server-1210707222',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1210707222',id=10,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD+9jK7t3ClieA+FSlSLflLYtwuLOAFMxwNPVMQvmnWXsj1kwhJZoZbs5/2/0cGkEuG79cH7XZnatiE6eDZhPpV+eOa2IU0p9RSwsraobttdhAQWALDgd6b3pAqIqOw/YA==',key_name='tempest-keypair-859145494',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12e621999b00481c839affc4e83ce37c',ramdisk_id='',reservation_id='r-7pfhccuf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-16198665
73',owner_user_name='tempest-AttachVolumeNegativeTest-1619866573-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:31:42Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a04be58cd354d508616edd9d5eeff54',uuid=319eafd3-e6ac-4fcc-92b1-a5f4e60952e8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f6cef929-1703-4207-bef4-18cfa6e9f6fc", "address": "fa:16:3e:cd:ca:12", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6cef929-17", "ovs_interfaceid": "f6cef929-1703-4207-bef4-18cfa6e9f6fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71283) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Converting VIF {"id": "f6cef929-1703-4207-bef4-18cfa6e9f6fc", "address": "fa:16:3e:cd:ca:12", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": 
"tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6cef929-17", "ovs_interfaceid": "f6cef929-1703-4207-bef4-18cfa6e9f6fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:ca:12,bridge_name='br-int',has_traffic_filtering=True,id=f6cef929-1703-4207-bef4-18cfa6e9f6fc,network=Network(51b24d7b-a37b-453a-a335-3ba809f26953),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6cef929-17') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG nova.objects.instance [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lazy-loading 'pci_devices' on Instance uuid 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None 
req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] End _get_guest_xml xml= [multi-line libvirt domain XML follows in the original log; the element markup was lost in extraction. Surviving values: uuid 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8, name instance-0000000a, memory 131072, vcpus 1, nova display name tempest-AttachVolumeNegativeTest-server-1210707222, creation time 2023-04-20 10:31:44, owner tempest-AttachVolumeNegativeTest-1619866573-project-member / tempest-AttachVolumeNegativeTest-1619866573, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0, os type hvm, CPU model Nehalem, RNG backend /dev/urandom.] {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:31:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1210707222',display_name='tempest-AttachVolumeNegativeTest-server-1210707222',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1210707222',id=10,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD+9jK7t3ClieA+FSlSLflLYtwuLOAFMxwNPVMQvmnWXsj1kwhJZoZbs5/2/0cGkEuG79cH7XZnatiE6eDZhPpV+eOa2IU0p9RSwsraobttdhAQWALDgd6b3pAqIqOw/YA==',key_name='tempest-keypair-859145494',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12e621999b00481c839affc4e83ce37c',ramdisk_id='',reservation_id='r-7pfhccuf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-1619866573',owner_user_name='tempest-AttachVolumeNegativeTest-1619866573-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:31:42Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a04be58cd354d508616edd9d5eeff54',uuid=319eafd3-e6ac-4fcc-92b1-a5f4e60952e8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f6cef929-1703-4207-bef4-18cfa6e9f6fc", "address": "fa:16:3e:cd:ca:12", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": 
{"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6cef929-17", "ovs_interfaceid": "f6cef929-1703-4207-bef4-18cfa6e9f6fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Converting VIF {"id": "f6cef929-1703-4207-bef4-18cfa6e9f6fc", "address": "fa:16:3e:cd:ca:12", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6cef929-17", "ovs_interfaceid": "f6cef929-1703-4207-bef4-18cfa6e9f6fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:ca:12,bridge_name='br-int',has_traffic_filtering=True,id=f6cef929-1703-4207-bef4-18cfa6e9f6fc,network=Network(51b24d7b-a37b-453a-a335-3ba809f26953),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6cef929-17') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG os_vif [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:ca:12,bridge_name='br-int',has_traffic_filtering=True,id=f6cef929-1703-4207-bef4-18cfa6e9f6fc,network=Network(51b24d7b-a37b-453a-a335-3ba809f26953),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6cef929-17') {{(pid=71283) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71283) do_commit 
/usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf6cef929-17, may_exist=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf6cef929-17, col_values=(('external_ids', {'iface-id': 'f6cef929-1703-4207-bef4-18cfa6e9f6fc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cd:ca:12', 'vm-uuid': '319eafd3-e6ac-4fcc-92b1-a5f4e60952e8'}),)) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:44 user nova-compute[71283]: INFO os_vif [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Successfully plugged vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:cd:ca:12,bridge_name='br-int',has_traffic_filtering=True,id=f6cef929-1703-4207-bef4-18cfa6e9f6fc,network=Network(51b24d7b-a37b-453a-a335-3ba809f26953),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6cef929-17') Apr 20 10:31:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] No BDM found with device name vda, not building metadata. {{(pid=71283) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] No VIF found with MAC fa:16:3e:cd:ca:12, not building metadata {{(pid=71283) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG nova.network.neutron [req-03c01caa-cdba-4bc2-a2f9-a2635a756776 req-592034d8-11f7-4a49-8e5d-3f512612c772 service nova] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Updated VIF entry in instance network info cache for port f6cef929-1703-4207-bef4-18cfa6e9f6fc. 
{{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG nova.network.neutron [req-03c01caa-cdba-4bc2-a2f9-a2635a756776 req-592034d8-11f7-4a49-8e5d-3f512612c772 service nova] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Updating instance_info_cache with network_info: [{"id": "f6cef929-1703-4207-bef4-18cfa6e9f6fc", "address": "fa:16:3e:cd:ca:12", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6cef929-17", "ovs_interfaceid": "f6cef929-1703-4207-bef4-18cfa6e9f6fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:31:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-03c01caa-cdba-4bc2-a2f9-a2635a756776 req-592034d8-11f7-4a49-8e5d-3f512612c772 service nova] Releasing lock "refresh_cache-319eafd3-e6ac-4fcc-92b1-a5f4e60952e8" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:31:45 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:45 user 
nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:45 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:46 user nova-compute[71283]: DEBUG nova.compute.manager [req-496770b2-5f09-4b50-89da-df7738a2e488 req-54a27cd3-2e45-45f4-9c5b-73c59af739ca service nova] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Received event network-vif-plugged-f6cef929-1703-4207-bef4-18cfa6e9f6fc {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:31:46 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-496770b2-5f09-4b50-89da-df7738a2e488 req-54a27cd3-2e45-45f4-9c5b-73c59af739ca service nova] Acquiring lock "319eafd3-e6ac-4fcc-92b1-a5f4e60952e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:31:46 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-496770b2-5f09-4b50-89da-df7738a2e488 req-54a27cd3-2e45-45f4-9c5b-73c59af739ca service nova] Lock "319eafd3-e6ac-4fcc-92b1-a5f4e60952e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:31:46 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-496770b2-5f09-4b50-89da-df7738a2e488 req-54a27cd3-2e45-45f4-9c5b-73c59af739ca service nova] Lock "319eafd3-e6ac-4fcc-92b1-a5f4e60952e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:31:46 user 
nova-compute[71283]: DEBUG nova.compute.manager [req-496770b2-5f09-4b50-89da-df7738a2e488 req-54a27cd3-2e45-45f4-9c5b-73c59af739ca service nova] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] No waiting events found dispatching network-vif-plugged-f6cef929-1703-4207-bef4-18cfa6e9f6fc {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:31:46 user nova-compute[71283]: WARNING nova.compute.manager [req-496770b2-5f09-4b50-89da-df7738a2e488 req-54a27cd3-2e45-45f4-9c5b-73c59af739ca service nova] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Received unexpected event network-vif-plugged-f6cef929-1703-4207-bef4-18cfa6e9f6fc for instance with vm_state building and task_state spawning. Apr 20 10:31:46 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:46 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:46 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:46 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:47 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:47 user nova-compute[71283]: DEBUG nova.compute.manager [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Instance event wait completed in 0 seconds for {{(pid=71283) 
wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 10:31:47 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Guest created on hypervisor {{(pid=71283) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 10:31:47 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event <LifecycleEvent: ... => Resumed> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:31:47 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] VM Resumed (Lifecycle Event) Apr 20 10:31:47 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Instance spawned successfully. 
Apr 20 10:31:47 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 10:31:47 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:31:47 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:31:47 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Found default for hw_cdrom_bus of ide {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:31:47 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Found default for hw_disk_bus of virtio {{(pid=71283) 
_register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:31:47 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Found default for hw_input_bus of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:31:47 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Found default for hw_pointer_model of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:31:47 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Found default for hw_video_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:31:47 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Found default for hw_vif_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:31:47 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 20 10:31:47 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event <LifecycleEvent: ... => Started> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:31:47 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] VM Started (Lifecycle Event) Apr 20 10:31:47 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:31:47 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:31:47 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 10:31:47 user nova-compute[71283]: INFO nova.compute.manager [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Took 5.51 seconds to spawn the instance on the hypervisor. 
Apr 20 10:31:47 user nova-compute[71283]: DEBUG nova.compute.manager [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:31:47 user nova-compute[71283]: INFO nova.compute.manager [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Took 6.29 seconds to build instance. Apr 20 10:31:47 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-127ac422-29a9-4056-8c18-5a36adcaab5d tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "319eafd3-e6ac-4fcc-92b1-a5f4e60952e8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.378s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:31:48 user nova-compute[71283]: DEBUG nova.compute.manager [req-ac6c1229-5c03-447b-b6fc-79aad4b5bec1 req-149e6291-f242-42cb-9d08-4a69e5e41e7b service nova] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Received event network-vif-plugged-f6cef929-1703-4207-bef4-18cfa6e9f6fc {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:31:48 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-ac6c1229-5c03-447b-b6fc-79aad4b5bec1 req-149e6291-f242-42cb-9d08-4a69e5e41e7b service nova] Acquiring lock "319eafd3-e6ac-4fcc-92b1-a5f4e60952e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:31:48 user nova-compute[71283]: DEBUG 
oslo_concurrency.lockutils [req-ac6c1229-5c03-447b-b6fc-79aad4b5bec1 req-149e6291-f242-42cb-9d08-4a69e5e41e7b service nova] Lock "319eafd3-e6ac-4fcc-92b1-a5f4e60952e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:31:48 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-ac6c1229-5c03-447b-b6fc-79aad4b5bec1 req-149e6291-f242-42cb-9d08-4a69e5e41e7b service nova] Lock "319eafd3-e6ac-4fcc-92b1-a5f4e60952e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:31:48 user nova-compute[71283]: DEBUG nova.compute.manager [req-ac6c1229-5c03-447b-b6fc-79aad4b5bec1 req-149e6291-f242-42cb-9d08-4a69e5e41e7b service nova] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] No waiting events found dispatching network-vif-plugged-f6cef929-1703-4207-bef4-18cfa6e9f6fc {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:31:48 user nova-compute[71283]: WARNING nova.compute.manager [req-ac6c1229-5c03-447b-b6fc-79aad4b5bec1 req-149e6291-f242-42cb-9d08-4a69e5e41e7b service nova] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Received unexpected event network-vif-plugged-f6cef929-1703-4207-bef4-18cfa6e9f6fc for instance with vm_state active and task_state None. 
Apr 20 10:31:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:50 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:51 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Acquiring lock "e1115ad2-b858-4859-b1bb-2175f7eab867" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:31:51 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "e1115ad2-b858-4859-b1bb-2175f7eab867" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:31:51 user nova-compute[71283]: DEBUG nova.compute.manager [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Starting instance... 
{{(pid=71283) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 10:31:51 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:31:51 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:31:51 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71283) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}}
Apr 20 10:31:51 user nova-compute[71283]: INFO nova.compute.claims [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Claim successful on node user
Apr 20 10:31:52 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
Apr 20 10:31:52 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
Apr 20 10:31:52 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.344s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
Apr 20 10:31:52 user nova-compute[71283]: DEBUG nova.compute.manager [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Start building networks asynchronously for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}}
Apr 20 10:31:52 user nova-compute[71283]: DEBUG nova.compute.manager [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Allocating IP information in the background. {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}}
Apr 20 10:31:52 user nova-compute[71283]: DEBUG nova.network.neutron [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] allocate_for_instance() {{(pid=71283) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}}
Apr 20 10:31:52 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Apr 20 10:31:52 user nova-compute[71283]: DEBUG nova.compute.manager [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Start building block device mappings for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}}
Apr 20 10:31:52 user nova-compute[71283]: DEBUG nova.policy [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '729f2fdafa8e471e8f0de0c8323c36b5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd6d8880a9263444cba94725a83974403', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71283) authorize /opt/stack/nova/nova/policy.py:203}}
Apr 20 10:31:52 user nova-compute[71283]: DEBUG nova.compute.manager [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Start spawning the instance on the hypervisor. {{(pid=71283) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}}
Apr 20 10:31:52 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Creating instance directory {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}}
Apr 20 10:31:52 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Creating image(s)
Apr 20 10:31:52 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Acquiring lock "/opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
Apr 20 10:31:52 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "/opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
Apr 20 10:31:52 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "/opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.003s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
Apr 20 10:31:52 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 20 10:31:52 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Acquiring lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
Apr 20 10:31:52 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
Apr 20 10:31:52 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.144s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 20 10:31:52 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Acquiring lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
Apr 20 10:31:52 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
Apr 20 10:31:52 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 20 10:31:52 user nova-compute[71283]: DEBUG nova.compute.manager [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Starting instance... {{(pid=71283) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}}
Apr 20 10:31:52 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.143s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 20 10:31:52 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk 1073741824 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 20 10:31:52 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
Apr 20 10:31:52 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.003s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
Apr 20 10:31:52 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71283) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}}
Apr 20 10:31:52 user nova-compute[71283]: INFO nova.compute.claims [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Claim successful on node user
Apr 20 10:31:52 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk 1073741824" returned: 0 in 0.058s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 20 10:31:52 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.204s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
Apr 20 10:31:52 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 20 10:31:52 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.133s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 20 10:31:52 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Checking if we can resize image /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk. size=1073741824 {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}}
Apr 20 10:31:52 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG nova.network.neutron [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Successfully created port: c47ac25a-c0d0-4443-8759-c573281b63f9 {{(pid=71283) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk --force-share --output=json" returned: 0 in 0.166s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Cannot resize image /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk to a smaller size. {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG nova.objects.instance [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lazy-loading 'migration_context' on Instance uuid e1115ad2-b858-4859-b1bb-2175f7eab867 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Created local disks {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Ensure instance console log exists: /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/console.log {{(pid=71283) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.427s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG nova.compute.manager [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Start building networks asynchronously for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG nova.compute.manager [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Allocating IP information in the background. {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG nova.network.neutron [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] allocate_for_instance() {{(pid=71283) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}}
Apr 20 10:31:53 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Apr 20 10:31:53 user nova-compute[71283]: DEBUG nova.compute.manager [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Start building block device mappings for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG nova.policy [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '729f2fdafa8e471e8f0de0c8323c36b5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd6d8880a9263444cba94725a83974403', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71283) authorize /opt/stack/nova/nova/policy.py:203}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG nova.compute.manager [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Start spawning the instance on the hypervisor. {{(pid=71283) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Creating instance directory {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}}
Apr 20 10:31:53 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Creating image(s)
Apr 20 10:31:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Acquiring lock "/opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "/opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "/opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.138s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Acquiring lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.130s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk 1073741824 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk 1073741824" returned: 0 in 0.047s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.183s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG nova.network.neutron [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Successfully updated port: c47ac25a-c0d0-4443-8759-c573281b63f9 {{(pid=71283) _update_port /opt/stack/nova/nova/network/neutron.py:584}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG nova.compute.manager [req-f6d1ec62-e29d-47a3-8eca-20c161eb9848 req-450c5071-a922-4d16-a502-67787f410125 service nova] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Received event network-changed-c47ac25a-c0d0-4443-8759-c573281b63f9 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG nova.compute.manager [req-f6d1ec62-e29d-47a3-8eca-20c161eb9848 req-450c5071-a922-4d16-a502-67787f410125 service nova] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Refreshing instance network info cache due to event network-changed-c47ac25a-c0d0-4443-8759-c573281b63f9. {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-f6d1ec62-e29d-47a3-8eca-20c161eb9848 req-450c5071-a922-4d16-a502-67787f410125 service nova] Acquiring lock "refresh_cache-e1115ad2-b858-4859-b1bb-2175f7eab867" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-f6d1ec62-e29d-47a3-8eca-20c161eb9848 req-450c5071-a922-4d16-a502-67787f410125 service nova] Acquired lock "refresh_cache-e1115ad2-b858-4859-b1bb-2175f7eab867" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG nova.network.neutron [req-f6d1ec62-e29d-47a3-8eca-20c161eb9848 req-450c5071-a922-4d16-a502-67787f410125 service nova] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Refreshing network info cache for port c47ac25a-c0d0-4443-8759-c573281b63f9 {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Acquiring lock "refresh_cache-e1115ad2-b858-4859-b1bb-2175f7eab867" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG nova.network.neutron [req-f6d1ec62-e29d-47a3-8eca-20c161eb9848 req-450c5071-a922-4d16-a502-67787f410125 service nova] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Instance cache missing network info. {{(pid=71283) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.126s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Checking if we can resize image /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk. size=1073741824 {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}}
Apr 20 10:31:53 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.network.neutron [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Successfully created port: 56f36d99-9847-4d4d-bf9b-c1c2244bac79
{{(pid=71283) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Cannot resize image /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk to a smaller size. {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.objects.instance [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lazy-loading 'migration_context' on Instance uuid 50cff6dc-1947-417a-8c5f-0b10de5dfd3a {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Created local disks {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-653c649f-97ce-4e46-8952-2c0870fffaed 
tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Ensure instance console log exists: /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/console.log {{(pid=71283) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.network.neutron [req-f6d1ec62-e29d-47a3-8eca-20c161eb9848 req-450c5071-a922-4d16-a502-67787f410125 service nova] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info 
/opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-f6d1ec62-e29d-47a3-8eca-20c161eb9848 req-450c5071-a922-4d16-a502-67787f410125 service nova] Releasing lock "refresh_cache-e1115ad2-b858-4859-b1bb-2175f7eab867" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Acquired lock "refresh_cache-e1115ad2-b858-4859-b1bb-2175f7eab867" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.network.neutron [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Building network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.network.neutron [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Instance cache missing network info. 
{{(pid=71283) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.network.neutron [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Updating instance_info_cache with network_info: [{"id": "c47ac25a-c0d0-4443-8759-c573281b63f9", "address": "fa:16:3e:bd:ab:60", "network": {"id": "ae2c97fe-5f89-4830-9303-439b5956dc16", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1597087439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d6d8880a9263444cba94725a83974403", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47ac25a-c0", "ovs_interfaceid": "c47ac25a-c0d0-4443-8759-c573281b63f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None 
req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Releasing lock "refresh_cache-e1115ad2-b858-4859-b1bb-2175f7eab867" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.compute.manager [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Instance network_info: |[{"id": "c47ac25a-c0d0-4443-8759-c573281b63f9", "address": "fa:16:3e:bd:ab:60", "network": {"id": "ae2c97fe-5f89-4830-9303-439b5956dc16", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1597087439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d6d8880a9263444cba94725a83974403", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47ac25a-c0", "ovs_interfaceid": "c47ac25a-c0d0-4443-8759-c573281b63f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Start 
_get_guest_xml network_info=[{"id": "c47ac25a-c0d0-4443-8759-c573281b63f9", "address": "fa:16:3e:bd:ab:60", "network": {"id": "ae2c97fe-5f89-4830-9303-439b5956dc16", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1597087439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d6d8880a9263444cba94725a83974403", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47ac25a-c0", "ovs_interfaceid": "c47ac25a-c0d0-4443-8759-c573281b63f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': 
'3687b278-c2bc-46f2-9b7e-579c8d06fe41'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 10:31:54 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:31:54 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71283) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T10:25:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=), allow threads: True {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Flavor limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Image limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Flavor pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Image pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 
tempest-ServerRescueNegativeTestJSON-678567479-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Got 1 possible topologies {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Sorted desired topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:31:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1901028417',display_name='tempest-ServerRescueNegativeTestJSON-server-1901028417',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1901028417',id=11,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d6d8880a9263444cba94725a83974403',ramdisk_id='',reservation_id='r-6r1f814t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object=
'images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-678567479',owner_user_name='tempest-ServerRescueNegativeTestJSON-678567479-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:31:52Z,user_data=None,user_id='729f2fdafa8e471e8f0de0c8323c36b5',uuid=e1115ad2-b858-4859-b1bb-2175f7eab867,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c47ac25a-c0d0-4443-8759-c573281b63f9", "address": "fa:16:3e:bd:ab:60", "network": {"id": "ae2c97fe-5f89-4830-9303-439b5956dc16", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1597087439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d6d8880a9263444cba94725a83974403", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47ac25a-c0", "ovs_interfaceid": "c47ac25a-c0d0-4443-8759-c573281b63f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71283) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Converting VIF {"id": "c47ac25a-c0d0-4443-8759-c573281b63f9", "address": "fa:16:3e:bd:ab:60", "network": {"id": "ae2c97fe-5f89-4830-9303-439b5956dc16", 
"bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1597087439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d6d8880a9263444cba94725a83974403", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47ac25a-c0", "ovs_interfaceid": "c47ac25a-c0d0-4443-8759-c573281b63f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:ab:60,bridge_name='br-int',has_traffic_filtering=True,id=c47ac25a-c0d0-4443-8759-c573281b63f9,network=Network(ae2c97fe-5f89-4830-9303-439b5956dc16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc47ac25a-c0') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.objects.instance [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lazy-loading 'pci_devices' on Instance uuid e1115ad2-b858-4859-b1bb-2175f7eab867 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:31:54 user 
nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] End _get_guest_xml xml= Apr 20 10:31:54 user nova-compute[71283]: e1115ad2-b858-4859-b1bb-2175f7eab867 Apr 20 10:31:54 user nova-compute[71283]: instance-0000000b Apr 20 10:31:54 user nova-compute[71283]: 131072 Apr 20 10:31:54 user nova-compute[71283]: 1 Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: tempest-ServerRescueNegativeTestJSON-server-1901028417 Apr 20 10:31:54 user nova-compute[71283]: 2023-04-20 10:31:54 Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: 128 Apr 20 10:31:54 user nova-compute[71283]: 1 Apr 20 10:31:54 user nova-compute[71283]: 0 Apr 20 10:31:54 user nova-compute[71283]: 0 Apr 20 10:31:54 user nova-compute[71283]: 1 Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: tempest-ServerRescueNegativeTestJSON-678567479-project-member Apr 20 10:31:54 user nova-compute[71283]: tempest-ServerRescueNegativeTestJSON-678567479 Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: OpenStack Foundation Apr 20 10:31:54 user nova-compute[71283]: OpenStack Nova Apr 20 10:31:54 user nova-compute[71283]: 0.0.0 Apr 20 10:31:54 user 
nova-compute[71283]: e1115ad2-b858-4859-b1bb-2175f7eab867 Apr 20 10:31:54 user nova-compute[71283]: e1115ad2-b858-4859-b1bb-2175f7eab867 Apr 20 10:31:54 user nova-compute[71283]: Virtual Machine Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: hvm Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Nehalem Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: /dev/urandom Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 
user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: Apr 20 10:31:54 user nova-compute[71283]: {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:31:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1901028417',display_name='tempest-ServerRescueNegativeTestJSON-server-1901028417',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1901028417',id=11,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d6d8880a9263444cba94725a83974403',ramdisk_id='',reservation_id='r-6r1f814t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',im
age_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-678567479',owner_user_name='tempest-ServerRescueNegativeTestJSON-678567479-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:31:52Z,user_data=None,user_id='729f2fdafa8e471e8f0de0c8323c36b5',uuid=e1115ad2-b858-4859-b1bb-2175f7eab867,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c47ac25a-c0d0-4443-8759-c573281b63f9", "address": "fa:16:3e:bd:ab:60", "network": {"id": "ae2c97fe-5f89-4830-9303-439b5956dc16", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1597087439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d6d8880a9263444cba94725a83974403", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47ac25a-c0", "ovs_interfaceid": "c47ac25a-c0d0-4443-8759-c573281b63f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 
tempest-ServerRescueNegativeTestJSON-678567479-project-member] Converting VIF {"id": "c47ac25a-c0d0-4443-8759-c573281b63f9", "address": "fa:16:3e:bd:ab:60", "network": {"id": "ae2c97fe-5f89-4830-9303-439b5956dc16", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1597087439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d6d8880a9263444cba94725a83974403", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47ac25a-c0", "ovs_interfaceid": "c47ac25a-c0d0-4443-8759-c573281b63f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:ab:60,bridge_name='br-int',has_traffic_filtering=True,id=c47ac25a-c0d0-4443-8759-c573281b63f9,network=Network(ae2c97fe-5f89-4830-9303-439b5956dc16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc47ac25a-c0') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG os_vif [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 
tempest-ServerRescueNegativeTestJSON-678567479-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:ab:60,bridge_name='br-int',has_traffic_filtering=True,id=c47ac25a-c0d0-4443-8759-c573281b63f9,network=Network(ae2c97fe-5f89-4830-9303-439b5956dc16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc47ac25a-c0') {{(pid=71283) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc47ac25a-c0, may_exist=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc47ac25a-c0, col_values=(('external_ids', {'iface-id': 'c47ac25a-c0d0-4443-8759-c573281b63f9', 'iface-status': 'active', 'attached-mac': 
'fa:16:3e:bd:ab:60', 'vm-uuid': 'e1115ad2-b858-4859-b1bb-2175f7eab867'}),)) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:31:54 user nova-compute[71283]: INFO os_vif [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:ab:60,bridge_name='br-int',has_traffic_filtering=True,id=c47ac25a-c0d0-4443-8759-c573281b63f9,network=Network(ae2c97fe-5f89-4830-9303-439b5956dc16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc47ac25a-c0') Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71283) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] No VIF found with MAC fa:16:3e:bd:ab:60, not building metadata {{(pid=71283) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.network.neutron [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Successfully updated port: 56f36d99-9847-4d4d-bf9b-c1c2244bac79 {{(pid=71283) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Acquiring lock "refresh_cache-50cff6dc-1947-417a-8c5f-0b10de5dfd3a" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Acquired lock "refresh_cache-50cff6dc-1947-417a-8c5f-0b10de5dfd3a" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.network.neutron [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Building network info cache for instance 
{{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.compute.manager [req-4bd1090f-a122-44c6-9032-ca4ce1294dc5 req-41d0aa64-54ba-4f84-b960-272f0982dfd9 service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Received event network-changed-56f36d99-9847-4d4d-bf9b-c1c2244bac79 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.compute.manager [req-4bd1090f-a122-44c6-9032-ca4ce1294dc5 req-41d0aa64-54ba-4f84-b960-272f0982dfd9 service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Refreshing instance network info cache due to event network-changed-56f36d99-9847-4d4d-bf9b-c1c2244bac79. {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-4bd1090f-a122-44c6-9032-ca4ce1294dc5 req-41d0aa64-54ba-4f84-b960-272f0982dfd9 service nova] Acquiring lock "refresh_cache-50cff6dc-1947-417a-8c5f-0b10de5dfd3a" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:31:54 user nova-compute[71283]: DEBUG nova.network.neutron [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Instance cache missing network info. 
{{(pid=71283) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 10:31:55 user nova-compute[71283]: DEBUG nova.network.neutron [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Updating instance_info_cache with network_info: [{"id": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "address": "fa:16:3e:c7:53:84", "network": {"id": "ae2c97fe-5f89-4830-9303-439b5956dc16", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1597087439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d6d8880a9263444cba94725a83974403", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap56f36d99-98", "ovs_interfaceid": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:31:55 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Releasing lock "refresh_cache-50cff6dc-1947-417a-8c5f-0b10de5dfd3a" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:31:55 user nova-compute[71283]: DEBUG nova.compute.manager [None 
req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Instance network_info: |[{"id": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "address": "fa:16:3e:c7:53:84", "network": {"id": "ae2c97fe-5f89-4830-9303-439b5956dc16", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1597087439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d6d8880a9263444cba94725a83974403", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap56f36d99-98", "ovs_interfaceid": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 10:31:55 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-4bd1090f-a122-44c6-9032-ca4ce1294dc5 req-41d0aa64-54ba-4f84-b960-272f0982dfd9 service nova] Acquired lock "refresh_cache-50cff6dc-1947-417a-8c5f-0b10de5dfd3a" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:31:55 user nova-compute[71283]: DEBUG nova.network.neutron [req-4bd1090f-a122-44c6-9032-ca4ce1294dc5 req-41d0aa64-54ba-4f84-b960-272f0982dfd9 service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Refreshing network info cache for port 56f36d99-9847-4d4d-bf9b-c1c2244bac79 {{(pid=71283) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:31:55 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Start _get_guest_xml network_info=[{"id": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "address": "fa:16:3e:c7:53:84", "network": {"id": "ae2c97fe-5f89-4830-9303-439b5956dc16", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1597087439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d6d8880a9263444cba94725a83974403", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap56f36d99-98", "ovs_interfaceid": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '3687b278-c2bc-46f2-9b7e-579c8d06fe41'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 10:31:55 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:31:55 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:31:55 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71283) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 10:31:55 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T10:25:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=), allow threads: True {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 10:31:55 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Flavor limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 10:31:55 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 
tempest-ServerRescueNegativeTestJSON-678567479-project-member] Image limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 10:31:55 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Flavor pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 10:31:55 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Image pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 10:31:55 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 10:31:55 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 10:31:55 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71283) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 10:31:55 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Got 1 possible topologies {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 10:31:55 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 10:31:55 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 10:31:55 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:31:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-325022623',display_name='tempest-ServerRescueNegativeTestJSON-server-325022623',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-325022623',id=12,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d6d8880a9263444cba94725a83974403',ramdisk_id='',reservation_id='r-5my5271p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-678567479',owner_user_name='tempest-ServerRescueNegativeTestJSON-678567479-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:31:53Z,user_data=None,user_id=
'729f2fdafa8e471e8f0de0c8323c36b5',uuid=50cff6dc-1947-417a-8c5f-0b10de5dfd3a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "address": "fa:16:3e:c7:53:84", "network": {"id": "ae2c97fe-5f89-4830-9303-439b5956dc16", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1597087439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d6d8880a9263444cba94725a83974403", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap56f36d99-98", "ovs_interfaceid": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71283) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 10:31:55 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Converting VIF {"id": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "address": "fa:16:3e:c7:53:84", "network": {"id": "ae2c97fe-5f89-4830-9303-439b5956dc16", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1597087439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": 
"10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d6d8880a9263444cba94725a83974403", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap56f36d99-98", "ovs_interfaceid": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:31:55 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c7:53:84,bridge_name='br-int',has_traffic_filtering=True,id=56f36d99-9847-4d4d-bf9b-c1c2244bac79,network=Network(ae2c97fe-5f89-4830-9303-439b5956dc16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56f36d99-98') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:31:55 user nova-compute[71283]: DEBUG nova.objects.instance [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lazy-loading 'pci_devices' on Instance uuid 50cff6dc-1947-417a-8c5f-0b10de5dfd3a {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:31:55 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] End _get_guest_xml xml= Apr 20 10:31:55 user nova-compute[71283]: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a Apr 20 10:31:55 user 
nova-compute[71283]: [libvirt guest XML elided; element markup was stripped in this capture. Recoverable fields: domain name instance-0000000c, uuid 50cff6dc-1947-417a-8c5f-0b10de5dfd3a, memory 131072, 1 vCPU; Nova metadata: server name tempest-ServerRescueNegativeTestJSON-server-325022623, creation time 2023-04-20 10:31:55, flavor values 128/1/0/0/1, owner user tempest-ServerRescueNegativeTestJSON-678567479-project-member, project tempest-ServerRescueNegativeTestJSON-678567479; sysinfo: OpenStack Foundation / OpenStack Nova / 0.0.0 / serial and uuid 50cff6dc-1947-417a-8c5f-0b10de5dfd3a, family Virtual Machine; os type hvm; CPU model Nehalem; rng backend /dev/urandom] {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 10:31:55 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif
[None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:31:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-325022623',display_name='tempest-ServerRescueNegativeTestJSON-server-325022623',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-325022623',id=12,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d6d8880a9263444cba94725a83974403',ramdisk_id='',reservation_id='r-5my5271p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-678567479',owner_user_name='tempest-ServerRescueNegativeTestJSON-678567479-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:31:53Z,user_data=None,user_id='729f2fdafa8e471e8f0de0c8323c36b5',uuid=50cff6dc-1947-417a-8c5f-0b10de5dfd3a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "address": "fa:16:3e:c7:53:84", "network": {"id": "ae2c97fe-5f89-4830-9303-439b5956dc16", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1597087439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d6d8880a9263444cba94725a83974403", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap56f36d99-98", "ovs_interfaceid": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}}
Apr 20 10:31:55 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Converting VIF {"id": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "address": "fa:16:3e:c7:53:84", "network": {"id": "ae2c97fe-5f89-4830-9303-439b5956dc16", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1597087439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d6d8880a9263444cba94725a83974403", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap56f36d99-98", "ovs_interfaceid": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}}
Apr 20 10:31:55 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c7:53:84,bridge_name='br-int',has_traffic_filtering=True,id=56f36d99-9847-4d4d-bf9b-c1c2244bac79,network=Network(ae2c97fe-5f89-4830-9303-439b5956dc16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56f36d99-98') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}}
Apr 20 10:31:55 user nova-compute[71283]: DEBUG os_vif [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:53:84,bridge_name='br-int',has_traffic_filtering=True,id=56f36d99-9847-4d4d-bf9b-c1c2244bac79,network=Network(ae2c97fe-5f89-4830-9303-439b5956dc16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56f36d99-98') {{(pid=71283) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}}
Apr 20 10:31:55 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:31:55 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}}
Apr 20 10:31:55 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}}
Apr 20 10:31:55 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:31:55 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap56f36d99-98, may_exist=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}}
Apr 20 10:31:55 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap56f36d99-98, col_values=(('external_ids', {'iface-id': '56f36d99-9847-4d4d-bf9b-c1c2244bac79', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c7:53:84', 'vm-uuid': '50cff6dc-1947-417a-8c5f-0b10de5dfd3a'}),)) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}}
Apr 20 10:31:55 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:31:55 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}}
Apr 20 10:31:55 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:31:55 user nova-compute[71283]: INFO os_vif [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:53:84,bridge_name='br-int',has_traffic_filtering=True,id=56f36d99-9847-4d4d-bf9b-c1c2244bac79,network=Network(ae2c97fe-5f89-4830-9303-439b5956dc16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56f36d99-98')
Apr 20 10:31:55 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] No BDM found with device name vda, not building metadata. {{(pid=71283) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}}
Apr 20 10:31:55 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] No VIF found with MAC fa:16:3e:c7:53:84, not building metadata {{(pid=71283) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}}
Apr 20 10:31:55 user nova-compute[71283]: DEBUG nova.network.neutron [req-4bd1090f-a122-44c6-9032-ca4ce1294dc5 req-41d0aa64-54ba-4f84-b960-272f0982dfd9 service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Updated VIF entry in instance network info cache for port 56f36d99-9847-4d4d-bf9b-c1c2244bac79. {{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}}
Apr 20 10:31:56 user nova-compute[71283]: DEBUG nova.network.neutron [req-4bd1090f-a122-44c6-9032-ca4ce1294dc5 req-41d0aa64-54ba-4f84-b960-272f0982dfd9 service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Updating instance_info_cache with network_info: [{"id": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "address": "fa:16:3e:c7:53:84", "network": {"id": "ae2c97fe-5f89-4830-9303-439b5956dc16", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1597087439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d6d8880a9263444cba94725a83974403", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap56f36d99-98", "ovs_interfaceid": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
Apr 20 10:31:56 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-4bd1090f-a122-44c6-9032-ca4ce1294dc5 req-41d0aa64-54ba-4f84-b960-272f0982dfd9 service nova] Releasing lock "refresh_cache-50cff6dc-1947-417a-8c5f-0b10de5dfd3a" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
Apr 20 10:31:57 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:31:57 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:31:57 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:31:57 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:31:57 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:31:57 user nova-compute[71283]: DEBUG nova.compute.manager [req-98a2a1db-7934-41d0-a382-1c969950d35e req-0ce1c515-465a-4de2-b80f-1b627e8cab6f service nova] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Received event network-vif-plugged-c47ac25a-c0d0-4443-8759-c573281b63f9 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}}
Apr 20 10:31:57 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-98a2a1db-7934-41d0-a382-1c969950d35e req-0ce1c515-465a-4de2-b80f-1b627e8cab6f service nova] Acquiring lock "e1115ad2-b858-4859-b1bb-2175f7eab867-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
Apr 20 10:31:57 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-98a2a1db-7934-41d0-a382-1c969950d35e req-0ce1c515-465a-4de2-b80f-1b627e8cab6f service nova] Lock "e1115ad2-b858-4859-b1bb-2175f7eab867-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
Apr 20 10:31:57 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-98a2a1db-7934-41d0-a382-1c969950d35e req-0ce1c515-465a-4de2-b80f-1b627e8cab6f service nova] Lock "e1115ad2-b858-4859-b1bb-2175f7eab867-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
Apr 20 10:31:57 user nova-compute[71283]: DEBUG nova.compute.manager [req-98a2a1db-7934-41d0-a382-1c969950d35e req-0ce1c515-465a-4de2-b80f-1b627e8cab6f service nova] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] No waiting events found dispatching network-vif-plugged-c47ac25a-c0d0-4443-8759-c573281b63f9 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
Apr 20 10:31:57 user nova-compute[71283]: WARNING nova.compute.manager [req-98a2a1db-7934-41d0-a382-1c969950d35e req-0ce1c515-465a-4de2-b80f-1b627e8cab6f service nova] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Received unexpected event network-vif-plugged-c47ac25a-c0d0-4443-8759-c573281b63f9 for instance with vm_state building and task_state spawning.
Apr 20 10:31:57 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:31:57 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:31:57 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:31:57 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:31:57 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:31:58 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:31:58 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.compute.manager [req-bac3e7be-5daa-404c-9a56-a06562b80e1a req-095de349-65b2-42f6-98b8-601dc8e2398d service nova] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Received event network-vif-plugged-c47ac25a-c0d0-4443-8759-c573281b63f9 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-bac3e7be-5daa-404c-9a56-a06562b80e1a req-095de349-65b2-42f6-98b8-601dc8e2398d service nova] Acquiring lock "e1115ad2-b858-4859-b1bb-2175f7eab867-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-bac3e7be-5daa-404c-9a56-a06562b80e1a req-095de349-65b2-42f6-98b8-601dc8e2398d service nova] Lock "e1115ad2-b858-4859-b1bb-2175f7eab867-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-bac3e7be-5daa-404c-9a56-a06562b80e1a req-095de349-65b2-42f6-98b8-601dc8e2398d service nova] Lock "e1115ad2-b858-4859-b1bb-2175f7eab867-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.compute.manager [req-bac3e7be-5daa-404c-9a56-a06562b80e1a req-095de349-65b2-42f6-98b8-601dc8e2398d service nova] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] No waiting events found dispatching network-vif-plugged-c47ac25a-c0d0-4443-8759-c573281b63f9 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
Apr 20 10:31:59 user nova-compute[71283]: WARNING nova.compute.manager [req-bac3e7be-5daa-404c-9a56-a06562b80e1a req-095de349-65b2-42f6-98b8-601dc8e2398d service nova] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Received unexpected event network-vif-plugged-c47ac25a-c0d0-4443-8759-c573281b63f9 for instance with vm_state building and task_state spawning.
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.compute.manager [req-bac3e7be-5daa-404c-9a56-a06562b80e1a req-095de349-65b2-42f6-98b8-601dc8e2398d service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Received event network-vif-plugged-56f36d99-9847-4d4d-bf9b-c1c2244bac79 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-bac3e7be-5daa-404c-9a56-a06562b80e1a req-095de349-65b2-42f6-98b8-601dc8e2398d service nova] Acquiring lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-bac3e7be-5daa-404c-9a56-a06562b80e1a req-095de349-65b2-42f6-98b8-601dc8e2398d service nova] Lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-bac3e7be-5daa-404c-9a56-a06562b80e1a req-095de349-65b2-42f6-98b8-601dc8e2398d service nova] Lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.compute.manager [req-bac3e7be-5daa-404c-9a56-a06562b80e1a req-095de349-65b2-42f6-98b8-601dc8e2398d service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] No waiting events found dispatching network-vif-plugged-56f36d99-9847-4d4d-bf9b-c1c2244bac79 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
Apr 20 10:31:59 user nova-compute[71283]: WARNING nova.compute.manager [req-bac3e7be-5daa-404c-9a56-a06562b80e1a req-095de349-65b2-42f6-98b8-601dc8e2398d service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Received unexpected event network-vif-plugged-56f36d99-9847-4d4d-bf9b-c1c2244bac79 for instance with vm_state building and task_state spawning.
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.compute.manager [req-bac3e7be-5daa-404c-9a56-a06562b80e1a req-095de349-65b2-42f6-98b8-601dc8e2398d service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Received event network-vif-plugged-56f36d99-9847-4d4d-bf9b-c1c2244bac79 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-bac3e7be-5daa-404c-9a56-a06562b80e1a req-095de349-65b2-42f6-98b8-601dc8e2398d service nova] Acquiring lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-bac3e7be-5daa-404c-9a56-a06562b80e1a req-095de349-65b2-42f6-98b8-601dc8e2398d service nova] Lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-bac3e7be-5daa-404c-9a56-a06562b80e1a req-095de349-65b2-42f6-98b8-601dc8e2398d service nova] Lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.compute.manager [req-bac3e7be-5daa-404c-9a56-a06562b80e1a req-095de349-65b2-42f6-98b8-601dc8e2398d service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] No waiting events found dispatching network-vif-plugged-56f36d99-9847-4d4d-bf9b-c1c2244bac79 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
Apr 20 10:31:59 user nova-compute[71283]: WARNING nova.compute.manager [req-bac3e7be-5daa-404c-9a56-a06562b80e1a req-095de349-65b2-42f6-98b8-601dc8e2398d service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Received unexpected event network-vif-plugged-56f36d99-9847-4d4d-bf9b-c1c2244bac79 for instance with vm_state building and task_state spawning.
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Resumed> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}}
Apr 20 10:31:59 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] VM Resumed (Lifecycle Event)
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.compute.manager [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Instance event wait completed in 0 seconds for {{(pid=71283) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Guest created on hypervisor {{(pid=71283) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.compute.manager [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Instance event wait completed in 0 seconds for {{(pid=71283) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Guest created on hypervisor {{(pid=71283) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}}
Apr 20 10:31:59 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Instance spawned successfully.
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}}
Apr 20 10:31:59 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Instance spawned successfully.
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Found default for hw_cdrom_bus of ide {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Found default for hw_disk_bus of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Found default for hw_input_bus of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Found default for hw_pointer_model of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Found default for hw_video_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Found default for hw_vif_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Found default for hw_cdrom_bus of ide {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Found default for hw_disk_bus of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Found default for hw_input_bus of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Found default for hw_pointer_model of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Found default for hw_video_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Found default for hw_vif_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}}
Apr 20 10:31:59 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] During sync_power_state the instance has a pending task (spawning). Skip.
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Resumed> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}}
Apr 20 10:31:59 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] VM Resumed (Lifecycle Event)
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}}
Apr 20 10:31:59 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] During sync_power_state the instance has a pending task (spawning). Skip.
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Started> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}}
Apr 20 10:31:59 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] VM Started (Lifecycle Event)
Apr 20 10:31:59 user nova-compute[71283]: INFO nova.compute.manager [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Took 7.52 seconds to spawn the instance on the hypervisor.
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.compute.manager [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}}
Apr 20 10:31:59 user nova-compute[71283]: INFO nova.compute.manager [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Took 6.54 seconds to spawn the instance on the hypervisor.
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.compute.manager [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}}
Apr 20 10:31:59 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}}
Apr 20 10:32:00 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}}
Apr 20 10:32:00 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] During sync_power_state the instance has a pending task (spawning). Skip.
Apr 20 10:32:00 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Started> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:32:00 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] VM Started (Lifecycle Event) Apr 20 10:32:00 user nova-compute[71283]: INFO nova.compute.manager [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Took 8.21 seconds to build instance. Apr 20 10:32:00 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:32:00 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:32:00 user nova-compute[71283]: INFO nova.compute.manager [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Took 7.37 seconds to build instance. 
Apr 20 10:32:00 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7bb71586-a39a-4b0f-907b-9989767a6845 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "e1115ad2-b858-4859-b1bb-2175f7eab867" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 8.319s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:32:00 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-653c649f-97ce-4e46-8952-2c0870fffaed tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.509s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:32:00 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:05 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:10 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:11 user nova-compute[71283]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:13 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:32:13 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:32:13 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-4152143d-a0bf-46d4-a021-846818ab0d82 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Acquiring lock "f48e6aa1-dd33-42a4-89c9-20691b628c70" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:32:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-4152143d-a0bf-46d4-a021-846818ab0d82 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Lock "f48e6aa1-dd33-42a4-89c9-20691b628c70" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None 
req-4152143d-a0bf-46d4-a021-846818ab0d82 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Acquiring lock "f48e6aa1-dd33-42a4-89c9-20691b628c70-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-4152143d-a0bf-46d4-a021-846818ab0d82 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Lock "f48e6aa1-dd33-42a4-89c9-20691b628c70-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-4152143d-a0bf-46d4-a021-846818ab0d82 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Lock "f48e6aa1-dd33-42a4-89c9-20691b628c70-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:32:14 user nova-compute[71283]: INFO nova.compute.manager [None req-4152143d-a0bf-46d4-a021-846818ab0d82 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Terminating instance Apr 20 10:32:14 user nova-compute[71283]: DEBUG nova.compute.manager [None req-4152143d-a0bf-46d4-a021-846818ab0d82 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Start destroying the instance on the hypervisor. 
{{(pid=71283) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG nova.compute.manager [req-cfaf1dd7-33a6-4992-b8f2-3902b027f201 req-93964508-4bb8-451c-8867-b821f40c7d5f service nova] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Received event network-vif-unplugged-0954983e-5cb4-4486-9816-67f4f5f78b35 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-cfaf1dd7-33a6-4992-b8f2-3902b027f201 req-93964508-4bb8-451c-8867-b821f40c7d5f service nova] Acquiring lock "f48e6aa1-dd33-42a4-89c9-20691b628c70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-cfaf1dd7-33a6-4992-b8f2-3902b027f201 req-93964508-4bb8-451c-8867-b821f40c7d5f service nova] Lock "f48e6aa1-dd33-42a4-89c9-20691b628c70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-cfaf1dd7-33a6-4992-b8f2-3902b027f201 req-93964508-4bb8-451c-8867-b821f40c7d5f service nova] Lock "f48e6aa1-dd33-42a4-89c9-20691b628c70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG nova.compute.manager [req-cfaf1dd7-33a6-4992-b8f2-3902b027f201 req-93964508-4bb8-451c-8867-b821f40c7d5f service nova] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] No waiting events found dispatching network-vif-unplugged-0954983e-5cb4-4486-9816-67f4f5f78b35 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG nova.compute.manager [req-cfaf1dd7-33a6-4992-b8f2-3902b027f201 req-93964508-4bb8-451c-8867-b821f40c7d5f service nova] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Received event network-vif-unplugged-0954983e-5cb4-4486-9816-67f4f5f78b35 for instance with task_state deleting. 
{{(pid=71283) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG 
nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:32:14 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Instance destroyed successfully. Apr 20 10:32:14 user nova-compute[71283]: DEBUG nova.objects.instance [None req-4152143d-a0bf-46d4-a021-846818ab0d82 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Lazy-loading 'resources' on Instance uuid f48e6aa1-dd33-42a4-89c9-20691b628c70 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-4152143d-a0bf-46d4-a021-846818ab0d82 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:28:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1817883104',display_name='tempest-ServerStableDeviceRescueTest-server-1817883104',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverstabledevicerescuetest-server-1817883104',id=2,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFPer3atuApOKAbOrxgU77C03uvd3ClFPy3FRncu+A4hBfn7+Ligr5c8zlC1kkZihqe2/A/t/d8AFp1mDAF+J+oDrvVOTMmcabgcsFyIy7gKcmM4pBHM2RndymuOvmh30Q==',key_name='tempest-keypair-314135284',keypairs=,launch_index=0,launched_at=2023-04-20T10:28:38Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='2d1545b79af0497497e960cbd68aa8f2',ramdisk_id='',reservation_id='r-nvccs3w6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerStableDeviceRescueTest-1071968675',owner_user_name='tempest-ServerStableDeviceRescueTest-1071968675-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T10:30:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2aef9efd33b946709d8f01f41b79f382',uuid=f48e6aa1-dd33-42a4-89c9-20691b628c70,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0954983e-5cb4-4486-9816-67f4f5f78b35", "address": "fa:16:3e:dc:a8:8c", "network": {"id": "06131689-128d-43da-b721-a9dda3f2e19f", "bridge": "br-int", "label": 
"tempest-ServerStableDeviceRescueTest-1783672421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.122", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2d1545b79af0497497e960cbd68aa8f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0954983e-5c", "ovs_interfaceid": "0954983e-5cb4-4486-9816-67f4f5f78b35", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-4152143d-a0bf-46d4-a021-846818ab0d82 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Converting VIF {"id": "0954983e-5cb4-4486-9816-67f4f5f78b35", "address": "fa:16:3e:dc:a8:8c", "network": {"id": "06131689-128d-43da-b721-a9dda3f2e19f", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1783672421-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.122", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2d1545b79af0497497e960cbd68aa8f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, 
"connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0954983e-5c", "ovs_interfaceid": "0954983e-5cb4-4486-9816-67f4f5f78b35", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-4152143d-a0bf-46d4-a021-846818ab0d82 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dc:a8:8c,bridge_name='br-int',has_traffic_filtering=True,id=0954983e-5cb4-4486-9816-67f4f5f78b35,network=Network(06131689-128d-43da-b721-a9dda3f2e19f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0954983e-5c') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG os_vif [None req-4152143d-a0bf-46d4-a021-846818ab0d82 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:dc:a8:8c,bridge_name='br-int',has_traffic_filtering=True,id=0954983e-5cb4-4486-9816-67f4f5f78b35,network=Network(06131689-128d-43da-b721-a9dda3f2e19f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0954983e-5c') {{(pid=71283) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0954983e-5c, 
bridge=br-int, if_exists=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:14 user nova-compute[71283]: INFO os_vif [None req-4152143d-a0bf-46d4-a021-846818ab0d82 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dc:a8:8c,bridge_name='br-int',has_traffic_filtering=True,id=0954983e-5cb4-4486-9816-67f4f5f78b35,network=Network(06131689-128d-43da-b721-a9dda3f2e19f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0954983e-5c') Apr 20 10:32:14 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-4152143d-a0bf-46d4-a021-846818ab0d82 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Deleting instance files /opt/stack/data/nova/instances/f48e6aa1-dd33-42a4-89c9-20691b628c70_del Apr 20 10:32:14 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-4152143d-a0bf-46d4-a021-846818ab0d82 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Deletion of /opt/stack/data/nova/instances/f48e6aa1-dd33-42a4-89c9-20691b628c70_del complete Apr 20 10:32:14 user nova-compute[71283]: INFO nova.compute.manager [None req-4152143d-a0bf-46d4-a021-846818ab0d82 tempest-ServerStableDeviceRescueTest-1071968675 
tempest-ServerStableDeviceRescueTest-1071968675-project-member] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Took 0.87 seconds to destroy the instance on the hypervisor. Apr 20 10:32:14 user nova-compute[71283]: DEBUG oslo.service.loopingcall [None req-4152143d-a0bf-46d4-a021-846818ab0d82 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71283) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG nova.compute.manager [-] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Deallocating network for instance {{(pid=71283) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] deallocate_for_instance() {{(pid=71283) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 10:32:14 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/319eafd3-e6ac-4fcc-92b1-a5f4e60952e8/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:32:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/319eafd3-e6ac-4fcc-92b1-a5f4e60952e8/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71283) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:32:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/319eafd3-e6ac-4fcc-92b1-a5f4e60952e8/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:32:15 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/319eafd3-e6ac-4fcc-92b1-a5f4e60952e8/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:32:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:32:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:32:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:32:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:32:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/db38a0ab-0165-480d-bddb-cab50bcd22e4/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:32:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/db38a0ab-0165-480d-bddb-cab50bcd22e4/disk --force-share 
--output=json" returned: 0 in 0.139s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:32:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/db38a0ab-0165-480d-bddb-cab50bcd22e4/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:32:15 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:32:15 user nova-compute[71283]: DEBUG nova.compute.manager [req-40292338-af2e-4fab-a540-36eeacc7c8d9 req-b39fd3b9-de42-4be8-928b-e0c5d2e53745 service nova] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Received event network-vif-deleted-0954983e-5cb4-4486-9816-67f4f5f78b35 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:32:15 user nova-compute[71283]: INFO nova.compute.manager [req-40292338-af2e-4fab-a540-36eeacc7c8d9 req-b39fd3b9-de42-4be8-928b-e0c5d2e53745 service nova] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Neutron deleted interface 0954983e-5cb4-4486-9816-67f4f5f78b35; detaching it from the instance and deleting it from the info cache Apr 20 10:32:15 user nova-compute[71283]: DEBUG nova.network.neutron [req-40292338-af2e-4fab-a540-36eeacc7c8d9 req-b39fd3b9-de42-4be8-928b-e0c5d2e53745 service nova] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:32:15 user 
nova-compute[71283]: INFO nova.compute.manager [-] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Took 0.97 seconds to deallocate network for instance. Apr 20 10:32:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/db38a0ab-0165-480d-bddb-cab50bcd22e4/disk --force-share --output=json" returned: 0 in 0.155s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:32:15 user nova-compute[71283]: DEBUG nova.compute.manager [req-40292338-af2e-4fab-a540-36eeacc7c8d9 req-b39fd3b9-de42-4be8-928b-e0c5d2e53745 service nova] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Detach interface failed, port_id=0954983e-5cb4-4486-9816-67f4f5f78b35, reason: Instance f48e6aa1-dd33-42a4-89c9-20691b628c70 could not be found. {{(pid=71283) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 20 10:32:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:32:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-4152143d-a0bf-46d4-a021-846818ab0d82 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 
10:32:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-4152143d-a0bf-46d4-a021-846818ab0d82 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.003s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:32:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:32:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:32:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:32:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] 
Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9189f862-2e91-4420-ab64-54375c4f9466/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:32:16 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-4152143d-a0bf-46d4-a021-846818ab0d82 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:32:16 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-4152143d-a0bf-46d4-a021-846818ab0d82 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:32:16 user nova-compute[71283]: DEBUG nova.compute.manager [req-59db6c53-a6da-4d8a-8f46-0f3cd8bb4da4 req-c47138a3-3d30-49d6-bbba-eff603bc7836 service nova] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Received event network-vif-plugged-0954983e-5cb4-4486-9816-67f4f5f78b35 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:32:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils 
[req-59db6c53-a6da-4d8a-8f46-0f3cd8bb4da4 req-c47138a3-3d30-49d6-bbba-eff603bc7836 service nova] Acquiring lock "f48e6aa1-dd33-42a4-89c9-20691b628c70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:32:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-59db6c53-a6da-4d8a-8f46-0f3cd8bb4da4 req-c47138a3-3d30-49d6-bbba-eff603bc7836 service nova] Lock "f48e6aa1-dd33-42a4-89c9-20691b628c70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:32:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-59db6c53-a6da-4d8a-8f46-0f3cd8bb4da4 req-c47138a3-3d30-49d6-bbba-eff603bc7836 service nova] Lock "f48e6aa1-dd33-42a4-89c9-20691b628c70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:32:16 user nova-compute[71283]: DEBUG nova.compute.manager [req-59db6c53-a6da-4d8a-8f46-0f3cd8bb4da4 req-c47138a3-3d30-49d6-bbba-eff603bc7836 service nova] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] No waiting events found dispatching network-vif-plugged-0954983e-5cb4-4486-9816-67f4f5f78b35 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:32:16 user nova-compute[71283]: WARNING nova.compute.manager [req-59db6c53-a6da-4d8a-8f46-0f3cd8bb4da4 req-c47138a3-3d30-49d6-bbba-eff603bc7836 service nova] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Received unexpected event network-vif-plugged-0954983e-5cb4-4486-9816-67f4f5f78b35 for instance with vm_state deleted and task_state None. 
Apr 20 10:32:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-4152143d-a0bf-46d4-a021-846818ab0d82 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.395s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:32:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9189f862-2e91-4420-ab64-54375c4f9466/disk --force-share --output=json" returned: 0 in 0.166s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:32:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9189f862-2e91-4420-ab64-54375c4f9466/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:32:16 user nova-compute[71283]: INFO nova.scheduler.client.report [None req-4152143d-a0bf-46d4-a021-846818ab0d82 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Deleted allocations for instance f48e6aa1-dd33-42a4-89c9-20691b628c70 Apr 20 10:32:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-4152143d-a0bf-46d4-a021-846818ab0d82 tempest-ServerStableDeviceRescueTest-1071968675 tempest-ServerStableDeviceRescueTest-1071968675-project-member] Lock "f48e6aa1-dd33-42a4-89c9-20691b628c70" "released" by 
"nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.495s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:32:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9189f862-2e91-4420-ab64-54375c4f9466/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:32:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:32:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:32:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk --force-share --output=json 
{{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:32:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:32:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:32:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:32:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json {{(pid=71283) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:32:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json" returned: 0 in 0.170s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:32:17 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Error from libvirt while getting description of instance-00000002: [Error Code 42] Domain not found: no domain with matching uuid 'f48e6aa1-dd33-42a4-89c9-20691b628c70' (instance-00000002): libvirt.libvirtError: Domain not found: no domain with matching uuid 'f48e6aa1-dd33-42a4-89c9-20691b628c70' (instance-00000002) Apr 20 10:32:17 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:32:17 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:32:17 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=8239MB free_disk=26.49158477783203GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 
20 10:32:17 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:32:17 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:32:17 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 64fb0d55-ef35-4386-86fe-00775b83a8d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:32:17 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 9189f862-2e91-4420-ab64-54375c4f9466 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:32:17 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance db38a0ab-0165-480d-bddb-cab50bcd22e4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:32:17 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:32:17 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:32:17 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance e1115ad2-b858-4859-b1bb-2175f7eab867 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:32:17 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 50cff6dc-1947-417a-8c5f-0b10de5dfd3a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:32:17 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 7 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:32:17 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=1408MB phys_disk=40GB used_disk=7GB total_vcpus=12 used_vcpus=7 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:32:17 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:32:17 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:32:17 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:32:17 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.358s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:32:18 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:19 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:32:19 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:32:20 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e455f7fb-1464-41f0-875a-95e103abdf70 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Acquiring lock "db38a0ab-0165-480d-bddb-cab50bcd22e4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 
10:32:20 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e455f7fb-1464-41f0-875a-95e103abdf70 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "db38a0ab-0165-480d-bddb-cab50bcd22e4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:32:20 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e455f7fb-1464-41f0-875a-95e103abdf70 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Acquiring lock "db38a0ab-0165-480d-bddb-cab50bcd22e4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:32:20 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e455f7fb-1464-41f0-875a-95e103abdf70 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "db38a0ab-0165-480d-bddb-cab50bcd22e4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:32:20 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e455f7fb-1464-41f0-875a-95e103abdf70 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "db38a0ab-0165-480d-bddb-cab50bcd22e4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:32:20 user nova-compute[71283]: INFO nova.compute.manager [None 
req-e455f7fb-1464-41f0-875a-95e103abdf70 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Terminating instance Apr 20 10:32:20 user nova-compute[71283]: DEBUG nova.compute.manager [None req-e455f7fb-1464-41f0-875a-95e103abdf70 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Start destroying the instance on the hypervisor. {{(pid=71283) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 10:32:20 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "refresh_cache-9189f862-2e91-4420-ab64-54375c4f9466" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:32:20 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquired lock "refresh_cache-9189f862-2e91-4420-ab64-54375c4f9466" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:32:20 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Forcefully refreshing network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 10:32:20 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:20 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:20 user nova-compute[71283]: DEBUG nova.compute.manager 
[req-c3058e44-7c86-40f5-8238-bbc9bce2d30a req-e4d94549-6245-4f1e-a782-bfb6467552c7 service nova] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Received event network-vif-unplugged-7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:32:20 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-c3058e44-7c86-40f5-8238-bbc9bce2d30a req-e4d94549-6245-4f1e-a782-bfb6467552c7 service nova] Acquiring lock "db38a0ab-0165-480d-bddb-cab50bcd22e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:32:20 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-c3058e44-7c86-40f5-8238-bbc9bce2d30a req-e4d94549-6245-4f1e-a782-bfb6467552c7 service nova] Lock "db38a0ab-0165-480d-bddb-cab50bcd22e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:32:20 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-c3058e44-7c86-40f5-8238-bbc9bce2d30a req-e4d94549-6245-4f1e-a782-bfb6467552c7 service nova] Lock "db38a0ab-0165-480d-bddb-cab50bcd22e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:32:20 user nova-compute[71283]: DEBUG nova.compute.manager [req-c3058e44-7c86-40f5-8238-bbc9bce2d30a req-e4d94549-6245-4f1e-a782-bfb6467552c7 service nova] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] No waiting events found dispatching network-vif-unplugged-7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:32:20 user nova-compute[71283]: DEBUG nova.compute.manager
[req-c3058e44-7c86-40f5-8238-bbc9bce2d30a req-e4d94549-6245-4f1e-a782-bfb6467552c7 service nova] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Received event network-vif-unplugged-7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee for instance with task_state deleting. {{(pid=71283) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 10:32:20 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:20 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:20 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Instance destroyed successfully. Apr 20 10:32:20 user nova-compute[71283]: DEBUG nova.objects.instance [None req-e455f7fb-1464-41f0-875a-95e103abdf70 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lazy-loading 'resources' on Instance uuid db38a0ab-0165-480d-bddb-cab50bcd22e4 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:32:20 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-e455f7fb-1464-41f0-875a-95e103abdf70 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:30:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1784158863',display_name='tempest-VolumesAdminNegativeTest-server-1784158863',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1784158863',id=8,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-20T10:30:34Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='7c3ecc1463ec42eea56f2890b032ef7a',ramdisk_id='',reservation_id='r-9l2ew4yx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesAdminNegativeTest-2134667240',owner_user_name='tempest-VolumesAdminNegativeTest-2134667240-project-member'},tags=,task_state='deleting',termina
ted_at=None,trusted_certs=,updated_at=2023-04-20T10:30:34Z,user_data=None,user_id='712be6d6876f4b2c9d796e406a43f8bf',uuid=db38a0ab-0165-480d-bddb-cab50bcd22e4,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee", "address": "fa:16:3e:21:d2:da", "network": {"id": "55052d2c-5064-4bd3-9b46-6e90a5ee3def", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1948401123-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7c3ecc1463ec42eea56f2890b032ef7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d16f167-ea", "ovs_interfaceid": "7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 10:32:20 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-e455f7fb-1464-41f0-875a-95e103abdf70 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Converting VIF {"id": "7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee", "address": "fa:16:3e:21:d2:da", "network": {"id": "55052d2c-5064-4bd3-9b46-6e90a5ee3def", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1948401123-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7c3ecc1463ec42eea56f2890b032ef7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d16f167-ea", "ovs_interfaceid": "7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:32:20 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-e455f7fb-1464-41f0-875a-95e103abdf70 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:21:d2:da,bridge_name='br-int',has_traffic_filtering=True,id=7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee,network=Network(55052d2c-5064-4bd3-9b46-6e90a5ee3def),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d16f167-ea') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:32:20 user nova-compute[71283]: DEBUG os_vif [None req-e455f7fb-1464-41f0-875a-95e103abdf70 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:21:d2:da,bridge_name='br-int',has_traffic_filtering=True,id=7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee,network=Network(55052d2c-5064-4bd3-9b46-6e90a5ee3def),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d16f167-ea') {{(pid=71283) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 10:32:20 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:20 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d16f167-ea, bridge=br-int, if_exists=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:32:20 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:20 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:32:20 user nova-compute[71283]: INFO os_vif [None req-e455f7fb-1464-41f0-875a-95e103abdf70 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:21:d2:da,bridge_name='br-int',has_traffic_filtering=True,id=7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee,network=Network(55052d2c-5064-4bd3-9b46-6e90a5ee3def),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d16f167-ea') Apr 20 10:32:20 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-e455f7fb-1464-41f0-875a-95e103abdf70 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Deleting instance files /opt/stack/data/nova/instances/db38a0ab-0165-480d-bddb-cab50bcd22e4_del Apr 20 10:32:20 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-e455f7fb-1464-41f0-875a-95e103abdf70 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Deletion of /opt/stack/data/nova/instances/db38a0ab-0165-480d-bddb-cab50bcd22e4_del 
complete Apr 20 10:32:20 user nova-compute[71283]: INFO nova.compute.manager [None req-e455f7fb-1464-41f0-875a-95e103abdf70 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Took 0.67 seconds to destroy the instance on the hypervisor. Apr 20 10:32:20 user nova-compute[71283]: DEBUG oslo.service.loopingcall [None req-e455f7fb-1464-41f0-875a-95e103abdf70 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=71283) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 10:32:20 user nova-compute[71283]: DEBUG nova.compute.manager [-] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Deallocating network for instance {{(pid=71283) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 10:32:20 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] deallocate_for_instance() {{(pid=71283) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 10:32:20 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:21 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Updating instance_info_cache with network_info: [{"id": "ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd", "address": "fa:16:3e:39:2f:ad", "network": {"id": "55052d2c-5064-4bd3-9b46-6e90a5ee3def", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1948401123-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4,
"meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.2", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7c3ecc1463ec42eea56f2890b032ef7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapab1a6c29-6d", "ovs_interfaceid": "ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:32:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Releasing lock "refresh_cache-9189f862-2e91-4420-ab64-54375c4f9466" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:32:21 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Updated the network info_cache for instance {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 10:32:21 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:32:21 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:32:21 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:32:21 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:32:21 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:32:21 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:32:21 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Took 0.63 seconds to deallocate network for instance. 
Apr 20 10:32:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e455f7fb-1464-41f0-875a-95e103abdf70 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:32:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e455f7fb-1464-41f0-875a-95e103abdf70 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:32:21 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-e455f7fb-1464-41f0-875a-95e103abdf70 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:32:21 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-e455f7fb-1464-41f0-875a-95e103abdf70 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:32:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e455f7fb-1464-41f0-875a-95e103abdf70 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.254s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:32:21 user nova-compute[71283]: INFO nova.scheduler.client.report [None req-e455f7fb-1464-41f0-875a-95e103abdf70 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Deleted allocations for instance db38a0ab-0165-480d-bddb-cab50bcd22e4 Apr 20 10:32:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e455f7fb-1464-41f0-875a-95e103abdf70 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "db38a0ab-0165-480d-bddb-cab50bcd22e4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.731s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:32:22 user nova-compute[71283]: DEBUG nova.compute.manager [req-83711cf6-698c-494d-91db-4b9851ce9d5a req-e9f72315-aaea-4bff-94d6-379e44c2465c service nova] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Received event network-vif-plugged-7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:32:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-83711cf6-698c-494d-91db-4b9851ce9d5a req-e9f72315-aaea-4bff-94d6-379e44c2465c service nova] Acquiring lock "db38a0ab-0165-480d-bddb-cab50bcd22e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:32:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-83711cf6-698c-494d-91db-4b9851ce9d5a req-e9f72315-aaea-4bff-94d6-379e44c2465c service nova] Lock "db38a0ab-0165-480d-bddb-cab50bcd22e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:32:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-83711cf6-698c-494d-91db-4b9851ce9d5a req-e9f72315-aaea-4bff-94d6-379e44c2465c service nova] Lock "db38a0ab-0165-480d-bddb-cab50bcd22e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:32:22 user nova-compute[71283]: DEBUG nova.compute.manager [req-83711cf6-698c-494d-91db-4b9851ce9d5a req-e9f72315-aaea-4bff-94d6-379e44c2465c service nova] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] No waiting events found dispatching network-vif-plugged-7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:32:22 user nova-compute[71283]: WARNING nova.compute.manager [req-83711cf6-698c-494d-91db-4b9851ce9d5a req-e9f72315-aaea-4bff-94d6-379e44c2465c service nova] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Received unexpected event network-vif-plugged-7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee for instance with vm_state deleted and task_state None.
Apr 20 10:32:22 user nova-compute[71283]: DEBUG nova.compute.manager [req-83711cf6-698c-494d-91db-4b9851ce9d5a req-e9f72315-aaea-4bff-94d6-379e44c2465c service nova] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Received event network-vif-deleted-7d16f167-ea3c-4ac0-b8a1-2f4616bef2ee {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:32:23 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:32:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:25 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:26 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:29 user nova-compute[71283]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:32:29 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] VM Stopped (Lifecycle Event) Apr 20 10:32:29 user nova-compute[71283]: DEBUG nova.compute.manager 
[None req-c7fbd5ad-309d-4ae5-a2d2-377d17b67054 None None] [instance: f48e6aa1-dd33-42a4-89c9-20691b628c70] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:32:30 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:35 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:35 user nova-compute[71283]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:32:35 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] VM Stopped (Lifecycle Event) Apr 20 10:32:35 user nova-compute[71283]: DEBUG nova.compute.manager [None req-268e673c-ed3a-4dbf-bb5a-b38f001ae151 None None] [instance: db38a0ab-0165-480d-bddb-cab50bcd22e4] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:32:35 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:39 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:40 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:42 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 
22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:45 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:50 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:55 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:57 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:32:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:00 user 
nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:05 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:07 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:09 user nova-compute[71283]: DEBUG nova.compute.manager [req-80b280af-31d7-4c11-8c15-77bda9d5eb91 req-8d468b19-1cca-44c1-a66e-53b86a7def57 service nova] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Received event network-changed-3a9efede-078b-4adc-9b72-465253f4444d {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:33:09 user nova-compute[71283]: DEBUG nova.compute.manager [req-80b280af-31d7-4c11-8c15-77bda9d5eb91 req-8d468b19-1cca-44c1-a66e-53b86a7def57 service nova] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Refreshing instance network info cache due to event network-changed-3a9efede-078b-4adc-9b72-465253f4444d. 
{{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:33:09 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-80b280af-31d7-4c11-8c15-77bda9d5eb91 req-8d468b19-1cca-44c1-a66e-53b86a7def57 service nova] Acquiring lock "refresh_cache-afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:33:09 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-80b280af-31d7-4c11-8c15-77bda9d5eb91 req-8d468b19-1cca-44c1-a66e-53b86a7def57 service nova] Acquired lock "refresh_cache-afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:33:09 user nova-compute[71283]: DEBUG nova.network.neutron [req-80b280af-31d7-4c11-8c15-77bda9d5eb91 req-8d468b19-1cca-44c1-a66e-53b86a7def57 service nova] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Refreshing network info cache for port 3a9efede-078b-4adc-9b72-465253f4444d {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:33:10 user nova-compute[71283]: DEBUG nova.network.neutron [req-80b280af-31d7-4c11-8c15-77bda9d5eb91 req-8d468b19-1cca-44c1-a66e-53b86a7def57 service nova] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Updated VIF entry in instance network info cache for port 3a9efede-078b-4adc-9b72-465253f4444d. 
{{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:33:10 user nova-compute[71283]: DEBUG nova.network.neutron [req-80b280af-31d7-4c11-8c15-77bda9d5eb91 req-8d468b19-1cca-44c1-a66e-53b86a7def57 service nova] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Updating instance_info_cache with network_info: [{"id": "3a9efede-078b-4adc-9b72-465253f4444d", "address": "fa:16:3e:9b:f8:4f", "network": {"id": "a64312c6-b1a8-450f-9e8d-0cf0282aef90", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-329219891-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.7", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1b4a2af680394ec889b4661753658b01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a9efede-07", "ovs_interfaceid": "3a9efede-078b-4adc-9b72-465253f4444d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:33:10 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-80b280af-31d7-4c11-8c15-77bda9d5eb91 req-8d468b19-1cca-44c1-a66e-53b86a7def57 service nova] Releasing lock "refresh_cache-afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af" {{(pid=71283) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-065d39c6-0121-4e0c-bd6a-e069ead40780 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Acquiring lock "9189f862-2e91-4420-ab64-54375c4f9466" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-065d39c6-0121-4e0c-bd6a-e069ead40780 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "9189f862-2e91-4420-ab64-54375c4f9466" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-065d39c6-0121-4e0c-bd6a-e069ead40780 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Acquiring lock "9189f862-2e91-4420-ab64-54375c4f9466-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-065d39c6-0121-4e0c-bd6a-e069ead40780 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "9189f862-2e91-4420-ab64-54375c4f9466-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:33:11 user nova-compute[71283]: 
DEBUG oslo_concurrency.lockutils [None req-065d39c6-0121-4e0c-bd6a-e069ead40780 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "9189f862-2e91-4420-ab64-54375c4f9466-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-02a2ac9a-bd67-41d0-ad1f-a5e23263c635 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Acquiring lock "afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-02a2ac9a-bd67-41d0-ad1f-a5e23263c635 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-02a2ac9a-bd67-41d0-ad1f-a5e23263c635 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Acquiring lock "afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-02a2ac9a-bd67-41d0-ad1f-a5e23263c635 tempest-AttachVolumeTestJSON-1715839687 
tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-02a2ac9a-bd67-41d0-ad1f-a5e23263c635 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:33:11 user nova-compute[71283]: INFO nova.compute.manager [None req-065d39c6-0121-4e0c-bd6a-e069ead40780 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Terminating instance Apr 20 10:33:11 user nova-compute[71283]: DEBUG nova.compute.manager [None req-065d39c6-0121-4e0c-bd6a-e069ead40780 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Start destroying the instance on the hypervisor. 
{{(pid=71283) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 10:33:11 user nova-compute[71283]: INFO nova.compute.manager [None req-02a2ac9a-bd67-41d0-ad1f-a5e23263c635 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Terminating instance Apr 20 10:33:11 user nova-compute[71283]: DEBUG nova.compute.manager [None req-02a2ac9a-bd67-41d0-ad1f-a5e23263c635 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Start destroying the instance on the hypervisor. {{(pid=71283) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG nova.compute.manager [req-9b6dcf8e-8224-4687-a438-6ded9597028d req-c304457a-ded4-4a39-b12b-d87f4b6d2d56 service nova] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Received event network-vif-unplugged-3a9efede-078b-4adc-9b72-465253f4444d {{(pid=71283) 
external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-9b6dcf8e-8224-4687-a438-6ded9597028d req-c304457a-ded4-4a39-b12b-d87f4b6d2d56 service nova] Acquiring lock "afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-9b6dcf8e-8224-4687-a438-6ded9597028d req-c304457a-ded4-4a39-b12b-d87f4b6d2d56 service nova] Lock "afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-9b6dcf8e-8224-4687-a438-6ded9597028d req-c304457a-ded4-4a39-b12b-d87f4b6d2d56 service nova] Lock "afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG nova.compute.manager [req-9b6dcf8e-8224-4687-a438-6ded9597028d req-c304457a-ded4-4a39-b12b-d87f4b6d2d56 service nova] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] No waiting events found dispatching network-vif-unplugged-3a9efede-078b-4adc-9b72-465253f4444d {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG nova.compute.manager [req-9b6dcf8e-8224-4687-a438-6ded9597028d req-c304457a-ded4-4a39-b12b-d87f4b6d2d56 service nova] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Received event network-vif-unplugged-3a9efede-078b-4adc-9b72-465253f4444d for instance 
with task_state deleting. {{(pid=71283) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG nova.compute.manager [req-1f969519-b19f-4d1d-af63-52043a134689 req-4d11873f-03b0-457e-be0d-3dc6d166dfff service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Received event network-vif-unplugged-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-1f969519-b19f-4d1d-af63-52043a134689 req-4d11873f-03b0-457e-be0d-3dc6d166dfff service nova] Acquiring lock "9189f862-2e91-4420-ab64-54375c4f9466-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-1f969519-b19f-4d1d-af63-52043a134689 req-4d11873f-03b0-457e-be0d-3dc6d166dfff service nova] Lock "9189f862-2e91-4420-ab64-54375c4f9466-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-1f969519-b19f-4d1d-af63-52043a134689 
req-4d11873f-03b0-457e-be0d-3dc6d166dfff service nova] Lock "9189f862-2e91-4420-ab64-54375c4f9466-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG nova.compute.manager [req-1f969519-b19f-4d1d-af63-52043a134689 req-4d11873f-03b0-457e-be0d-3dc6d166dfff service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] No waiting events found dispatching network-vif-unplugged-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG nova.compute.manager [req-1f969519-b19f-4d1d-af63-52043a134689 req-4d11873f-03b0-457e-be0d-3dc6d166dfff service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Received event network-vif-unplugged-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd for instance with task_state deleting. 
{{(pid=71283) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:11 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:12 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Instance destroyed successfully. Apr 20 10:33:12 user nova-compute[71283]: DEBUG nova.objects.instance [None req-065d39c6-0121-4e0c-bd6a-e069ead40780 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lazy-loading 'resources' on Instance uuid 9189f862-2e91-4420-ab64-54375c4f9466 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:33:12 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-065d39c6-0121-4e0c-bd6a-e069ead40780 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:28:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-42507989',display_name='tempest-VolumesAdminNegativeTest-server-42507989',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-42507989',id=5,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKo6j6WbCBGkhY+/90E7m8dSLZQNugKiG3Ze4NFSIzNr6U7x488jnMXjls3Or9lOdNpk237JBBGIDaL4rNwHM9ccpiAVRM3eozuQ2DPkyUdqXkWu62GoK7gs5ZQh/CHghQ==',key_name='tempest-keypair-1241550354',keypairs=<?>,launch_index=0,launched_at=2023-04-20T10:28:47Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7c3ecc1463ec42eea56f2890b032ef7a',ramdisk_id='',reservation_id='r-5fxeobjd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',
image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesAdminNegativeTest-2134667240',owner_user_name='tempest-VolumesAdminNegativeTest-2134667240-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2023-04-20T10:28:47Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='712be6d6876f4b2c9d796e406a43f8bf',uuid=9189f862-2e91-4420-ab64-54375c4f9466,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd", "address": "fa:16:3e:39:2f:ad", "network": {"id": "55052d2c-5064-4bd3-9b46-6e90a5ee3def", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1948401123-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.2", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7c3ecc1463ec42eea56f2890b032ef7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapab1a6c29-6d", "ovs_interfaceid": "ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 10:33:12 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-065d39c6-0121-4e0c-bd6a-e069ead40780 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Converting VIF {"id": "ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd", "address": 
"fa:16:3e:39:2f:ad", "network": {"id": "55052d2c-5064-4bd3-9b46-6e90a5ee3def", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1948401123-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.2", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7c3ecc1463ec42eea56f2890b032ef7a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapab1a6c29-6d", "ovs_interfaceid": "ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:33:12 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-065d39c6-0121-4e0c-bd6a-e069ead40780 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:39:2f:ad,bridge_name='br-int',has_traffic_filtering=True,id=ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd,network=Network(55052d2c-5064-4bd3-9b46-6e90a5ee3def),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab1a6c29-6d') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:33:12 user nova-compute[71283]: DEBUG os_vif [None req-065d39c6-0121-4e0c-bd6a-e069ead40780 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Unplugging vif 
VIFOpenVSwitch(active=True,address=fa:16:3e:39:2f:ad,bridge_name='br-int',has_traffic_filtering=True,id=ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd,network=Network(55052d2c-5064-4bd3-9b46-6e90a5ee3def),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab1a6c29-6d') {{(pid=71283) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 10:33:12 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:12 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapab1a6c29-6d, bridge=br-int, if_exists=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:33:12 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:12 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:33:12 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:12 user nova-compute[71283]: INFO os_vif [None req-065d39c6-0121-4e0c-bd6a-e069ead40780 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:39:2f:ad,bridge_name='br-int',has_traffic_filtering=True,id=ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd,network=Network(55052d2c-5064-4bd3-9b46-6e90a5ee3def),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab1a6c29-6d') Apr 20 10:33:12 user 
nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-065d39c6-0121-4e0c-bd6a-e069ead40780 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Deleting instance files /opt/stack/data/nova/instances/9189f862-2e91-4420-ab64-54375c4f9466_del Apr 20 10:33:12 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-065d39c6-0121-4e0c-bd6a-e069ead40780 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Deletion of /opt/stack/data/nova/instances/9189f862-2e91-4420-ab64-54375c4f9466_del complete Apr 20 10:33:12 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Instance destroyed successfully. Apr 20 10:33:12 user nova-compute[71283]: DEBUG nova.objects.instance [None req-02a2ac9a-bd67-41d0-ad1f-a5e23263c635 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lazy-loading 'resources' on Instance uuid afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:33:12 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-02a2ac9a-bd67-41d0-ad1f-a5e23263c635 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:31:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1492863401',display_name='tempest-AttachVolumeTestJSON-server-1492863401',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-1492863401',id=9,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCJ5hWqMeQoiTjX1acBo0XOFkMSjsm4q8qCy1ebdaywrtaD1l+xT9hnnoXuRjhZSD6F7eyAs+ruYJLSMyK+7gGsfIukb8brNhWiGX5TJye8A1hd9hjeQcPgsM9lbt6bNyw==',key_name='tempest-keypair-262941358',keypairs=<?>,launch_index=0,launched_at=2023-04-20T10:31:26Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1b4a2af680394ec889b4661753658b01',ramdisk_id='',reservation_id='r-z59spi1v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',
image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeTestJSON-1715839687',owner_user_name='tempest-AttachVolumeTestJSON-1715839687-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2023-04-20T10:31:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='67955d2e81c04b8d8dbcbe577303e025',uuid=afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3a9efede-078b-4adc-9b72-465253f4444d", "address": "fa:16:3e:9b:f8:4f", "network": {"id": "a64312c6-b1a8-450f-9e8d-0cf0282aef90", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-329219891-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.7", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1b4a2af680394ec889b4661753658b01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a9efede-07", "ovs_interfaceid": "3a9efede-078b-4adc-9b72-465253f4444d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 10:33:12 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-02a2ac9a-bd67-41d0-ad1f-a5e23263c635 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Converting VIF {"id": "3a9efede-078b-4adc-9b72-465253f4444d", "address": "fa:16:3e:9b:f8:4f", 
"network": {"id": "a64312c6-b1a8-450f-9e8d-0cf0282aef90", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-329219891-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.7", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1b4a2af680394ec889b4661753658b01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a9efede-07", "ovs_interfaceid": "3a9efede-078b-4adc-9b72-465253f4444d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:33:12 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-02a2ac9a-bd67-41d0-ad1f-a5e23263c635 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9b:f8:4f,bridge_name='br-int',has_traffic_filtering=True,id=3a9efede-078b-4adc-9b72-465253f4444d,network=Network(a64312c6-b1a8-450f-9e8d-0cf0282aef90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a9efede-07') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:33:12 user nova-compute[71283]: DEBUG os_vif [None req-02a2ac9a-bd67-41d0-ad1f-a5e23263c635 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Unplugging vif 
VIFOpenVSwitch(active=True,address=fa:16:3e:9b:f8:4f,bridge_name='br-int',has_traffic_filtering=True,id=3a9efede-078b-4adc-9b72-465253f4444d,network=Network(a64312c6-b1a8-450f-9e8d-0cf0282aef90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a9efede-07') {{(pid=71283) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 10:33:12 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:12 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a9efede-07, bridge=br-int, if_exists=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:33:12 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:12 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:33:12 user nova-compute[71283]: INFO os_vif [None req-02a2ac9a-bd67-41d0-ad1f-a5e23263c635 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9b:f8:4f,bridge_name='br-int',has_traffic_filtering=True,id=3a9efede-078b-4adc-9b72-465253f4444d,network=Network(a64312c6-b1a8-450f-9e8d-0cf0282aef90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a9efede-07') Apr 20 10:33:12 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-02a2ac9a-bd67-41d0-ad1f-a5e23263c635 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] 
[instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Deleting instance files /opt/stack/data/nova/instances/afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af_del Apr 20 10:33:12 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-02a2ac9a-bd67-41d0-ad1f-a5e23263c635 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Deletion of /opt/stack/data/nova/instances/afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af_del complete Apr 20 10:33:12 user nova-compute[71283]: INFO nova.compute.manager [None req-065d39c6-0121-4e0c-bd6a-e069ead40780 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Took 1.22 seconds to destroy the instance on the hypervisor. Apr 20 10:33:12 user nova-compute[71283]: DEBUG oslo.service.loopingcall [None req-065d39c6-0121-4e0c-bd6a-e069ead40780 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71283) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 10:33:12 user nova-compute[71283]: DEBUG nova.compute.manager [-] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Deallocating network for instance {{(pid=71283) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 10:33:12 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] deallocate_for_instance() {{(pid=71283) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 10:33:12 user nova-compute[71283]: INFO nova.compute.manager [None req-02a2ac9a-bd67-41d0-ad1f-a5e23263c635 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Took 1.24 seconds to destroy the instance on the hypervisor. Apr 20 10:33:12 user nova-compute[71283]: DEBUG oslo.service.loopingcall [None req-02a2ac9a-bd67-41d0-ad1f-a5e23263c635 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71283) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 10:33:12 user nova-compute[71283]: DEBUG nova.compute.manager [-] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Deallocating network for instance {{(pid=71283) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 10:33:12 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] deallocate_for_instance() {{(pid=71283) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 10:33:12 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:12 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:12 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:12 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:12 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:33:13 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Took 0.93 seconds to deallocate network for instance. 
Apr 20 10:33:13 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-02a2ac9a-bd67-41d0-ad1f-a5e23263c635 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-02a2ac9a-bd67-41d0-ad1f-a5e23263c635 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:33:13 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Took 1.06 seconds to deallocate network for instance. 
Apr 20 10:33:13 user nova-compute[71283]: DEBUG nova.compute.manager [req-c099b05f-6ae4-4db7-b182-e96f063efbd1 req-564031c2-0cc3-4809-a022-3270d0b54644 service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Received event network-vif-deleted-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-065d39c6-0121-4e0c-bd6a-e069ead40780 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG nova.compute.manager [req-4d8f6b85-d2e4-43b8-bb93-b1494f925058 req-7914be0e-65df-4afc-88b8-9f4c9e3a2c3d service nova] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Received event network-vif-plugged-3a9efede-078b-4adc-9b72-465253f4444d {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-4d8f6b85-d2e4-43b8-bb93-b1494f925058 req-7914be0e-65df-4afc-88b8-9f4c9e3a2c3d service nova] Acquiring lock "afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-4d8f6b85-d2e4-43b8-bb93-b1494f925058 req-7914be0e-65df-4afc-88b8-9f4c9e3a2c3d service nova] Lock "afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 
10:33:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-4d8f6b85-d2e4-43b8-bb93-b1494f925058 req-7914be0e-65df-4afc-88b8-9f4c9e3a2c3d service nova] Lock "afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG nova.compute.manager [req-4d8f6b85-d2e4-43b8-bb93-b1494f925058 req-7914be0e-65df-4afc-88b8-9f4c9e3a2c3d service nova] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] No waiting events found dispatching network-vif-plugged-3a9efede-078b-4adc-9b72-465253f4444d {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:33:13 user nova-compute[71283]: WARNING nova.compute.manager [req-4d8f6b85-d2e4-43b8-bb93-b1494f925058 req-7914be0e-65df-4afc-88b8-9f4c9e3a2c3d service nova] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Received unexpected event network-vif-plugged-3a9efede-078b-4adc-9b72-465253f4444d for instance with vm_state deleted and task_state None. 
Apr 20 10:33:13 user nova-compute[71283]: DEBUG nova.compute.manager [req-4d8f6b85-d2e4-43b8-bb93-b1494f925058 req-7914be0e-65df-4afc-88b8-9f4c9e3a2c3d service nova] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Received event network-vif-deleted-3a9efede-078b-4adc-9b72-465253f4444d {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-02a2ac9a-bd67-41d0-ad1f-a5e23263c635 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-02a2ac9a-bd67-41d0-ad1f-a5e23263c635 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-02a2ac9a-bd67-41d0-ad1f-a5e23263c635 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.255s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:33:13 user 
nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-065d39c6-0121-4e0c-bd6a-e069ead40780 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.173s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:33:13 user nova-compute[71283]: INFO nova.scheduler.client.report [None req-02a2ac9a-bd67-41d0-ad1f-a5e23263c635 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Deleted allocations for instance afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af Apr 20 10:33:13 user nova-compute[71283]: DEBUG nova.compute.manager [req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Received event network-vif-plugged-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] Acquiring lock "9189f862-2e91-4420-ab64-54375c4f9466-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] Lock "9189f862-2e91-4420-ab64-54375c4f9466-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils 
[req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] Lock "9189f862-2e91-4420-ab64-54375c4f9466-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG nova.compute.manager [req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] No waiting events found dispatching network-vif-plugged-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:33:13 user nova-compute[71283]: WARNING nova.compute.manager [req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Received unexpected event network-vif-plugged-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd for instance with vm_state deleted and task_state None. 
Apr 20 10:33:13 user nova-compute[71283]: DEBUG nova.compute.manager [req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Received event network-vif-plugged-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] Acquiring lock "9189f862-2e91-4420-ab64-54375c4f9466-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] Lock "9189f862-2e91-4420-ab64-54375c4f9466-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] Lock "9189f862-2e91-4420-ab64-54375c4f9466-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG nova.compute.manager [req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] No waiting events found dispatching network-vif-plugged-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 
20 10:33:13 user nova-compute[71283]: WARNING nova.compute.manager [req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Received unexpected event network-vif-plugged-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd for instance with vm_state deleted and task_state None. Apr 20 10:33:13 user nova-compute[71283]: DEBUG nova.compute.manager [req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Received event network-vif-plugged-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] Acquiring lock "9189f862-2e91-4420-ab64-54375c4f9466-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] Lock "9189f862-2e91-4420-ab64-54375c4f9466-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] Lock "9189f862-2e91-4420-ab64-54375c4f9466-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:33:13 user 
nova-compute[71283]: DEBUG nova.compute.manager [req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] No waiting events found dispatching network-vif-plugged-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:33:13 user nova-compute[71283]: WARNING nova.compute.manager [req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Received unexpected event network-vif-plugged-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd for instance with vm_state deleted and task_state None. Apr 20 10:33:13 user nova-compute[71283]: DEBUG nova.compute.manager [req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Received event network-vif-unplugged-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] Acquiring lock "9189f862-2e91-4420-ab64-54375c4f9466-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] Lock "9189f862-2e91-4420-ab64-54375c4f9466-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils 
[req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] Lock "9189f862-2e91-4420-ab64-54375c4f9466-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG nova.compute.manager [req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] No waiting events found dispatching network-vif-unplugged-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:33:13 user nova-compute[71283]: WARNING nova.compute.manager [req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Received unexpected event network-vif-unplugged-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd for instance with vm_state deleted and task_state None. 
Apr 20 10:33:13 user nova-compute[71283]: DEBUG nova.compute.manager [req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Received event network-vif-plugged-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] Acquiring lock "9189f862-2e91-4420-ab64-54375c4f9466-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] Lock "9189f862-2e91-4420-ab64-54375c4f9466-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] Lock "9189f862-2e91-4420-ab64-54375c4f9466-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG nova.compute.manager [req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] No waiting events found dispatching network-vif-plugged-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 
20 10:33:13 user nova-compute[71283]: WARNING nova.compute.manager [req-0384699a-c801-4c9d-be2c-7f92eaeec546 req-bdec3fc9-d1b8-4924-a62a-c4853d36b63e service nova] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Received unexpected event network-vif-plugged-ab1a6c29-6d99-4a73-b93f-4c4e3b8871bd for instance with vm_state deleted and task_state None. Apr 20 10:33:13 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-02a2ac9a-bd67-41d0-ad1f-a5e23263c635 tempest-AttachVolumeTestJSON-1715839687 tempest-AttachVolumeTestJSON-1715839687-project-member] Lock "afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.650s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-065d39c6-0121-4e0c-bd6a-e069ead40780 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-065d39c6-0121-4e0c-bd6a-e069ead40780 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 
'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:33:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-065d39c6-0121-4e0c-bd6a-e069ead40780 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.234s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:33:13 user nova-compute[71283]: INFO nova.scheduler.client.report [None req-065d39c6-0121-4e0c-bd6a-e069ead40780 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Deleted allocations for instance 9189f862-2e91-4420-ab64-54375c4f9466 Apr 20 10:33:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-065d39c6-0121-4e0c-bd6a-e069ead40780 tempest-VolumesAdminNegativeTest-2134667240 tempest-VolumesAdminNegativeTest-2134667240-project-member] Lock "9189f862-2e91-4420-ab64-54375c4f9466" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.873s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:33:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:14 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:33:15 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:33:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:33:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:33:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:33:15 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:33:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img 
info /opt/stack/data/nova/instances/319eafd3-e6ac-4fcc-92b1-a5f4e60952e8/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:33:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/319eafd3-e6ac-4fcc-92b1-a5f4e60952e8/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:33:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/319eafd3-e6ac-4fcc-92b1-a5f4e60952e8/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:33:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/319eafd3-e6ac-4fcc-92b1-a5f4e60952e8/disk --force-share --output=json" returned: 0 in 0.153s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:33:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk --force-share 
--output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:33:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:33:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:33:16 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:33:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:33:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:33:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:33:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:33:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share 
--output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:33:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:33:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:33:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:33:17 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:17 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:33:17 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:33:17 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=8709MB free_disk=26.557289123535156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, 
"label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": 
"0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": 
"0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 10:33:17 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:33:17 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:33:17 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 64fb0d55-ef35-4386-86fe-00775b83a8d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:33:17 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:33:17 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance e1115ad2-b858-4859-b1bb-2175f7eab867 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:33:17 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 50cff6dc-1947-417a-8c5f-0b10de5dfd3a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:33:17 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 4 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:33:17 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=1024MB phys_disk=40GB used_disk=4GB total_vcpus=12 used_vcpus=4 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:33:17 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:33:17 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:33:17 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:33:17 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.268s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:33:18 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:33:18 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:33:18 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info 
cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:33:18 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Didn't find any instances for network info cache update. {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 20 10:33:18 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:33:18 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:33:18 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:33:18 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:33:18 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:33:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:33:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:27 user nova-compute[71283]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:33:27 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] VM Stopped (Lifecycle Event) Apr 20 10:33:27 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08cf307a-b1ba-4b17-baa0-9be8f03d4dd2 None None] [instance: 9189f862-2e91-4420-ab64-54375c4f9466] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:33:27 user nova-compute[71283]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:33:27 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] VM Stopped (Lifecycle Event) Apr 20 10:33:27 user nova-compute[71283]: DEBUG nova.compute.manager [None req-0d86984f-d814-43da-897a-c6e7a162f684 None None] [instance: afd69aaa-b7e6-4ae5-a4ef-4b7da89c84af] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:33:27 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:28 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 
10:33:32 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:32 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:33 user nova-compute[71283]: DEBUG nova.compute.manager [req-08de60a5-6d8c-4f11-bca4-17e54603dbeb req-0996c050-1193-42bc-a352-a6df24aeec4f service nova] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Received event network-changed-f6cef929-1703-4207-bef4-18cfa6e9f6fc {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:33:33 user nova-compute[71283]: DEBUG nova.compute.manager [req-08de60a5-6d8c-4f11-bca4-17e54603dbeb req-0996c050-1193-42bc-a352-a6df24aeec4f service nova] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Refreshing instance network info cache due to event network-changed-f6cef929-1703-4207-bef4-18cfa6e9f6fc. 
{{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:33:33 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-08de60a5-6d8c-4f11-bca4-17e54603dbeb req-0996c050-1193-42bc-a352-a6df24aeec4f service nova] Acquiring lock "refresh_cache-319eafd3-e6ac-4fcc-92b1-a5f4e60952e8" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:33:33 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-08de60a5-6d8c-4f11-bca4-17e54603dbeb req-0996c050-1193-42bc-a352-a6df24aeec4f service nova] Acquired lock "refresh_cache-319eafd3-e6ac-4fcc-92b1-a5f4e60952e8" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:33:33 user nova-compute[71283]: DEBUG nova.network.neutron [req-08de60a5-6d8c-4f11-bca4-17e54603dbeb req-0996c050-1193-42bc-a352-a6df24aeec4f service nova] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Refreshing network info cache for port f6cef929-1703-4207-bef4-18cfa6e9f6fc {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:33:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:34 user nova-compute[71283]: DEBUG nova.network.neutron [req-08de60a5-6d8c-4f11-bca4-17e54603dbeb req-0996c050-1193-42bc-a352-a6df24aeec4f service nova] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Updated VIF entry in instance network info cache for port f6cef929-1703-4207-bef4-18cfa6e9f6fc. 
{{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:33:34 user nova-compute[71283]: DEBUG nova.network.neutron [req-08de60a5-6d8c-4f11-bca4-17e54603dbeb req-0996c050-1193-42bc-a352-a6df24aeec4f service nova] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Updating instance_info_cache with network_info: [{"id": "f6cef929-1703-4207-bef4-18cfa6e9f6fc", "address": "fa:16:3e:cd:ca:12", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.127", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6cef929-17", "ovs_interfaceid": "f6cef929-1703-4207-bef4-18cfa6e9f6fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:33:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-08de60a5-6d8c-4f11-bca4-17e54603dbeb req-0996c050-1193-42bc-a352-a6df24aeec4f service nova] Releasing lock "refresh_cache-319eafd3-e6ac-4fcc-92b1-a5f4e60952e8" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:33:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-fcaf0e4c-85d2-4045-ba1e-2511d88972ef 
tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquiring lock "319eafd3-e6ac-4fcc-92b1-a5f4e60952e8" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:33:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-fcaf0e4c-85d2-4045-ba1e-2511d88972ef tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "319eafd3-e6ac-4fcc-92b1-a5f4e60952e8" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:33:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-fcaf0e4c-85d2-4045-ba1e-2511d88972ef tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquiring lock "319eafd3-e6ac-4fcc-92b1-a5f4e60952e8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:33:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-fcaf0e4c-85d2-4045-ba1e-2511d88972ef tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "319eafd3-e6ac-4fcc-92b1-a5f4e60952e8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:33:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-fcaf0e4c-85d2-4045-ba1e-2511d88972ef tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock 
"319eafd3-e6ac-4fcc-92b1-a5f4e60952e8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:33:35 user nova-compute[71283]: INFO nova.compute.manager [None req-fcaf0e4c-85d2-4045-ba1e-2511d88972ef tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Terminating instance Apr 20 10:33:35 user nova-compute[71283]: DEBUG nova.compute.manager [None req-fcaf0e4c-85d2-4045-ba1e-2511d88972ef tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Start destroying the instance on the hypervisor. {{(pid=71283) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 10:33:35 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:35 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:35 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:35 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:35 user nova-compute[71283]: DEBUG nova.compute.manager [req-e85e3cb1-a244-4822-b8e8-406c4cc0053c req-cbf2c42b-733d-49fa-9000-148f094380bc service nova] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Received event network-vif-unplugged-f6cef929-1703-4207-bef4-18cfa6e9f6fc {{(pid=71283) 
external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:33:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-e85e3cb1-a244-4822-b8e8-406c4cc0053c req-cbf2c42b-733d-49fa-9000-148f094380bc service nova] Acquiring lock "319eafd3-e6ac-4fcc-92b1-a5f4e60952e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:33:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-e85e3cb1-a244-4822-b8e8-406c4cc0053c req-cbf2c42b-733d-49fa-9000-148f094380bc service nova] Lock "319eafd3-e6ac-4fcc-92b1-a5f4e60952e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:33:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-e85e3cb1-a244-4822-b8e8-406c4cc0053c req-cbf2c42b-733d-49fa-9000-148f094380bc service nova] Lock "319eafd3-e6ac-4fcc-92b1-a5f4e60952e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:33:35 user nova-compute[71283]: DEBUG nova.compute.manager [req-e85e3cb1-a244-4822-b8e8-406c4cc0053c req-cbf2c42b-733d-49fa-9000-148f094380bc service nova] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] No waiting events found dispatching network-vif-unplugged-f6cef929-1703-4207-bef4-18cfa6e9f6fc {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:33:35 user nova-compute[71283]: DEBUG nova.compute.manager [req-e85e3cb1-a244-4822-b8e8-406c4cc0053c req-cbf2c42b-733d-49fa-9000-148f094380bc service nova] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Received event network-vif-unplugged-f6cef929-1703-4207-bef4-18cfa6e9f6fc for instance 
with task_state deleting. {{(pid=71283) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 10:33:35 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Instance destroyed successfully. Apr 20 10:33:35 user nova-compute[71283]: DEBUG nova.objects.instance [None req-fcaf0e4c-85d2-4045-ba1e-2511d88972ef tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lazy-loading 'resources' on Instance uuid 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:33:35 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-fcaf0e4c-85d2-4045-ba1e-2511d88972ef tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:31:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1210707222',display_name='tempest-AttachVolumeNegativeTest-server-1210707222',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1210707222',id=10,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD+9jK7t3ClieA+FSlSLflLYtwuLOAFMxwNPVMQvmnWXsj1kwhJZoZbs5/2/0cGkEuG79cH7XZnatiE6eDZhPpV+eOa2IU0p9RSwsraobttdhAQWALDgd6b3pAqIqOw/YA==',key_name='tempest-keypair-859145494',keypairs=,launch_index=0,launched_at=2023-04-20T10:31:47Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='12e621999b00481c839affc4e83ce37c',ramdisk_id='',reservation_id='r-7pfhccuf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-1619866573',owner_user_name='tempest-AttachVolumeNegativeTest-1619866573-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T10:31:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a04be58cd354d508616edd9d5eeff54',uuid=319eafd3-e6ac-4fcc-92b1-a5f4e60952e8,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f6cef929-1703-4207-bef4-18cfa6e9f6fc", "address": "fa:16:3e:cd:ca:12", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": 
"tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.127", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6cef929-17", "ovs_interfaceid": "f6cef929-1703-4207-bef4-18cfa6e9f6fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 10:33:35 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-fcaf0e4c-85d2-4045-ba1e-2511d88972ef tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Converting VIF {"id": "f6cef929-1703-4207-bef4-18cfa6e9f6fc", "address": "fa:16:3e:cd:ca:12", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.127", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": 
"l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6cef929-17", "ovs_interfaceid": "f6cef929-1703-4207-bef4-18cfa6e9f6fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:33:35 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-fcaf0e4c-85d2-4045-ba1e-2511d88972ef tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cd:ca:12,bridge_name='br-int',has_traffic_filtering=True,id=f6cef929-1703-4207-bef4-18cfa6e9f6fc,network=Network(51b24d7b-a37b-453a-a335-3ba809f26953),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6cef929-17') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:33:35 user nova-compute[71283]: DEBUG os_vif [None req-fcaf0e4c-85d2-4045-ba1e-2511d88972ef tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cd:ca:12,bridge_name='br-int',has_traffic_filtering=True,id=f6cef929-1703-4207-bef4-18cfa6e9f6fc,network=Network(51b24d7b-a37b-453a-a335-3ba809f26953),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6cef929-17') {{(pid=71283) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 10:33:35 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:35 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6cef929-17, bridge=br-int, if_exists=True) {{(pid=71283) 
do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:33:35 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:35 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:35 user nova-compute[71283]: INFO os_vif [None req-fcaf0e4c-85d2-4045-ba1e-2511d88972ef tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cd:ca:12,bridge_name='br-int',has_traffic_filtering=True,id=f6cef929-1703-4207-bef4-18cfa6e9f6fc,network=Network(51b24d7b-a37b-453a-a335-3ba809f26953),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6cef929-17') Apr 20 10:33:35 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-fcaf0e4c-85d2-4045-ba1e-2511d88972ef tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Deleting instance files /opt/stack/data/nova/instances/319eafd3-e6ac-4fcc-92b1-a5f4e60952e8_del Apr 20 10:33:35 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-fcaf0e4c-85d2-4045-ba1e-2511d88972ef tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Deletion of /opt/stack/data/nova/instances/319eafd3-e6ac-4fcc-92b1-a5f4e60952e8_del complete Apr 20 10:33:35 user nova-compute[71283]: INFO nova.compute.manager [None req-fcaf0e4c-85d2-4045-ba1e-2511d88972ef tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 
319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Took 0.86 seconds to destroy the instance on the hypervisor. Apr 20 10:33:35 user nova-compute[71283]: DEBUG oslo.service.loopingcall [None req-fcaf0e4c-85d2-4045-ba1e-2511d88972ef tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71283) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 10:33:35 user nova-compute[71283]: DEBUG nova.compute.manager [-] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Deallocating network for instance {{(pid=71283) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 10:33:35 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] deallocate_for_instance() {{(pid=71283) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 10:33:37 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:33:37 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Took 1.60 seconds to deallocate network for instance. 
Apr 20 10:33:37 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-fcaf0e4c-85d2-4045-ba1e-2511d88972ef tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:33:37 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-fcaf0e4c-85d2-4045-ba1e-2511d88972ef tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:33:37 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:37 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-fcaf0e4c-85d2-4045-ba1e-2511d88972ef tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:33:37 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-fcaf0e4c-85d2-4045-ba1e-2511d88972ef tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:33:37 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-fcaf0e4c-85d2-4045-ba1e-2511d88972ef tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.175s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:33:37 user nova-compute[71283]: INFO nova.scheduler.client.report [None req-fcaf0e4c-85d2-4045-ba1e-2511d88972ef tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Deleted allocations for instance 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8 Apr 20 10:33:37 user nova-compute[71283]: DEBUG nova.compute.manager [req-72c1b078-2a81-4e5f-88d6-ac6a869d29f0 req-75e9eb78-7253-4cf1-9f7d-7080cf0c6337 service nova] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Received event network-vif-plugged-f6cef929-1703-4207-bef4-18cfa6e9f6fc {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:33:37 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-72c1b078-2a81-4e5f-88d6-ac6a869d29f0 req-75e9eb78-7253-4cf1-9f7d-7080cf0c6337 service nova] Acquiring lock "319eafd3-e6ac-4fcc-92b1-a5f4e60952e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:33:37 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-72c1b078-2a81-4e5f-88d6-ac6a869d29f0 req-75e9eb78-7253-4cf1-9f7d-7080cf0c6337 service nova] Lock "319eafd3-e6ac-4fcc-92b1-a5f4e60952e8-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:33:37 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-72c1b078-2a81-4e5f-88d6-ac6a869d29f0 req-75e9eb78-7253-4cf1-9f7d-7080cf0c6337 service nova] Lock "319eafd3-e6ac-4fcc-92b1-a5f4e60952e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:33:37 user nova-compute[71283]: DEBUG nova.compute.manager [req-72c1b078-2a81-4e5f-88d6-ac6a869d29f0 req-75e9eb78-7253-4cf1-9f7d-7080cf0c6337 service nova] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] No waiting events found dispatching network-vif-plugged-f6cef929-1703-4207-bef4-18cfa6e9f6fc {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:33:37 user nova-compute[71283]: WARNING nova.compute.manager [req-72c1b078-2a81-4e5f-88d6-ac6a869d29f0 req-75e9eb78-7253-4cf1-9f7d-7080cf0c6337 service nova] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Received unexpected event network-vif-plugged-f6cef929-1703-4207-bef4-18cfa6e9f6fc for instance with vm_state deleted and task_state None. 
Apr 20 10:33:37 user nova-compute[71283]: DEBUG nova.compute.manager [req-72c1b078-2a81-4e5f-88d6-ac6a869d29f0 req-75e9eb78-7253-4cf1-9f7d-7080cf0c6337 service nova] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Received event network-vif-deleted-f6cef929-1703-4207-bef4-18cfa6e9f6fc {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:33:37 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-fcaf0e4c-85d2-4045-ba1e-2511d88972ef tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "319eafd3-e6ac-4fcc-92b1-a5f4e60952e8" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.804s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:33:39 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:40 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:40 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:42 user nova-compute[71283]: INFO nova.compute.manager [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Rescuing Apr 20 10:33:42 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Acquiring lock "refresh_cache-50cff6dc-1947-417a-8c5f-0b10de5dfd3a" 
{{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:33:42 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Acquired lock "refresh_cache-50cff6dc-1947-417a-8c5f-0b10de5dfd3a" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:33:42 user nova-compute[71283]: DEBUG nova.network.neutron [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Building network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 10:33:43 user nova-compute[71283]: DEBUG nova.network.neutron [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Updating instance_info_cache with network_info: [{"id": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "address": "fa:16:3e:c7:53:84", "network": {"id": "ae2c97fe-5f89-4830-9303-439b5956dc16", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1597087439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d6d8880a9263444cba94725a83974403", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap56f36d99-98", 
"ovs_interfaceid": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:33:43 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Releasing lock "refresh_cache-50cff6dc-1947-417a-8c5f-0b10de5dfd3a" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:33:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.compute.manager [req-cba27ae4-2ead-4b0b-bed6-ca4eaf5cc55a req-284cf5b5-0267-4f43-9380-d56837beadcc service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Received event network-vif-unplugged-56f36d99-9847-4d4d-bf9b-c1c2244bac79 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-cba27ae4-2ead-4b0b-bed6-ca4eaf5cc55a req-284cf5b5-0267-4f43-9380-d56837beadcc service nova] Acquiring lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-cba27ae4-2ead-4b0b-bed6-ca4eaf5cc55a req-284cf5b5-0267-4f43-9380-d56837beadcc service nova] Lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s 
{{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-cba27ae4-2ead-4b0b-bed6-ca4eaf5cc55a req-284cf5b5-0267-4f43-9380-d56837beadcc service nova] Lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.compute.manager [req-cba27ae4-2ead-4b0b-bed6-ca4eaf5cc55a req-284cf5b5-0267-4f43-9380-d56837beadcc service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] No waiting events found dispatching network-vif-unplugged-56f36d99-9847-4d4d-bf9b-c1c2244bac79 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:33:44 user nova-compute[71283]: WARNING nova.compute.manager [req-cba27ae4-2ead-4b0b-bed6-ca4eaf5cc55a req-284cf5b5-0267-4f43-9380-d56837beadcc service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Received unexpected event network-vif-unplugged-56f36d99-9847-4d4d-bf9b-c1c2244bac79 for instance with vm_state active and task_state rescuing. Apr 20 10:33:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:44 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Instance destroyed successfully. 
Apr 20 10:33:44 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Attempting rescue Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} {{(pid=71283) rescue /opt/stack/nova/nova/virt/libvirt/driver.py:4289}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Instance directory exists: not creating {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4694}} Apr 20 10:33:44 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Creating image(s) Apr 20 10:33:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Acquiring lock "/opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk.info" by 
"nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "/opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "/opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.objects.instance [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lazy-loading 'trusted_certs' on Instance uuid 50cff6dc-1947-417a-8c5f-0b10de5dfd3a {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Acquiring lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" 
{{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.144s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Running cmd (subprocess): env 
LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk.rescue {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk.rescue" returned: 0 in 0.047s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.197s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.objects.instance [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lazy-loading 'migration_context' on Instance uuid 50cff6dc-1947-417a-8c5f-0b10de5dfd3a {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 
tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Created local disks {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Start _get_guest_xml network_info=[{"id": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "address": "fa:16:3e:c7:53:84", "network": {"id": "ae2c97fe-5f89-4830-9303-439b5956dc16", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1597087439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1597087439-network", "vif_mac": "fa:16:3e:c7:53:84"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d6d8880a9263444cba94725a83974403", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap56f36d99-98", "ovs_interfaceid": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=) rescue={'image_id': '3687b278-c2bc-46f2-9b7e-579c8d06fe41', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.objects.instance [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lazy-loading 'resources' on Instance uuid 50cff6dc-1947-417a-8c5f-0b10de5dfd3a {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.objects.instance [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lazy-loading 'numa_topology' on Instance uuid 50cff6dc-1947-417a-8c5f-0b10de5dfd3a {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:33:44 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:33:44 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71283) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T10:25:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=), allow threads: True {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 
tempest-ServerRescueNegativeTestJSON-678567479-project-member] Flavor limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Image limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Flavor pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Image pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71283) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Got 1 possible topologies {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.objects.instance [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lazy-loading 'vcpu_model' on Instance uuid 50cff6dc-1947-417a-8c5f-0b10de5dfd3a {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-b023db88-2c8f-4ded-bcb2-76944215080a 
tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:31:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-325022623',display_name='tempest-ServerRescueNegativeTestJSON-server-325022623',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-325022623',id=12,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-20T10:32:00Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='d6d8880a9263444cba94725a83974403',ramdisk_id='',reservation_id='r-5my5271p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerRescueN
egativeTestJSON-678567479',owner_user_name='tempest-ServerRescueNegativeTestJSON-678567479-project-member'},tags=,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:32:00Z,user_data=None,user_id='729f2fdafa8e471e8f0de0c8323c36b5',uuid=50cff6dc-1947-417a-8c5f-0b10de5dfd3a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "address": "fa:16:3e:c7:53:84", "network": {"id": "ae2c97fe-5f89-4830-9303-439b5956dc16", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1597087439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1597087439-network", "vif_mac": "fa:16:3e:c7:53:84"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d6d8880a9263444cba94725a83974403", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap56f36d99-98", "ovs_interfaceid": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71283) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Converting VIF {"id": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "address": "fa:16:3e:c7:53:84", "network": {"id": "ae2c97fe-5f89-4830-9303-439b5956dc16", "bridge": "br-int", "label": 
"tempest-ServerRescueNegativeTestJSON-1597087439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1597087439-network", "vif_mac": "fa:16:3e:c7:53:84"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d6d8880a9263444cba94725a83974403", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap56f36d99-98", "ovs_interfaceid": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c7:53:84,bridge_name='br-int',has_traffic_filtering=True,id=56f36d99-9847-4d4d-bf9b-c1c2244bac79,network=Network(ae2c97fe-5f89-4830-9303-439b5956dc16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56f36d99-98') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.objects.instance [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lazy-loading 'pci_devices' on Instance uuid 50cff6dc-1947-417a-8c5f-0b10de5dfd3a {{(pid=71283) obj_load_attr 
/opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] End _get_guest_xml xml= [multi-line libvirt guest domain XML omitted: the markup was stripped during log extraction, leaving only bare element values interleaved with repeated syslog prefixes; recoverable values include uuid 50cff6dc-1947-417a-8c5f-0b10de5dfd3a, name instance-0000000c, memory 131072, 1 vCPU, title tempest-ServerRescueNegativeTestJSON-server-325022623, creation time 2023-04-20 10:33:44, owner tempest-ServerRescueNegativeTestJSON-678567479-project-member / tempest-ServerRescueNegativeTestJSON-678567479, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0, os type hvm, CPU model Nehalem, RNG backend /dev/urandom] {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 10:33:44 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Instance destroyed successfully. Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] No BDM found with device name vda, not building metadata. {{(pid=71283) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] No BDM found with device name vdb, not building metadata. 
{{(pid=71283) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 10:33:44 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] No VIF found with MAC fa:16:3e:c7:53:84, not building metadata {{(pid=71283) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 10:33:45 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:45 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:46 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:46 user nova-compute[71283]: DEBUG nova.compute.manager [req-5cf3fbd2-2dd1-43a5-9f7b-53385fda9ed9 req-2a24a8a9-d79d-4de3-87f7-5521231b60cb service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Received event network-vif-plugged-56f36d99-9847-4d4d-bf9b-c1c2244bac79 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:33:46 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-5cf3fbd2-2dd1-43a5-9f7b-53385fda9ed9 req-2a24a8a9-d79d-4de3-87f7-5521231b60cb service nova] Acquiring lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:33:46 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-5cf3fbd2-2dd1-43a5-9f7b-53385fda9ed9 req-2a24a8a9-d79d-4de3-87f7-5521231b60cb service nova] Lock 
"50cff6dc-1947-417a-8c5f-0b10de5dfd3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:33:46 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-5cf3fbd2-2dd1-43a5-9f7b-53385fda9ed9 req-2a24a8a9-d79d-4de3-87f7-5521231b60cb service nova] Lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:33:46 user nova-compute[71283]: DEBUG nova.compute.manager [req-5cf3fbd2-2dd1-43a5-9f7b-53385fda9ed9 req-2a24a8a9-d79d-4de3-87f7-5521231b60cb service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] No waiting events found dispatching network-vif-plugged-56f36d99-9847-4d4d-bf9b-c1c2244bac79 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:33:46 user nova-compute[71283]: WARNING nova.compute.manager [req-5cf3fbd2-2dd1-43a5-9f7b-53385fda9ed9 req-2a24a8a9-d79d-4de3-87f7-5521231b60cb service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Received unexpected event network-vif-plugged-56f36d99-9847-4d4d-bf9b-c1c2244bac79 for instance with vm_state active and task_state rescuing. 
Apr 20 10:33:46 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:46 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:46 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:46 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:48 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:48 user nova-compute[71283]: DEBUG nova.compute.manager [req-910ffff1-2de0-432c-a70e-30a75c499679 req-8393d1ae-8ae7-42a9-b933-d0c808fca0e7 service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Received event network-vif-plugged-56f36d99-9847-4d4d-bf9b-c1c2244bac79 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:33:48 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-910ffff1-2de0-432c-a70e-30a75c499679 req-8393d1ae-8ae7-42a9-b933-d0c808fca0e7 service nova] Acquiring lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:33:48 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-910ffff1-2de0-432c-a70e-30a75c499679 req-8393d1ae-8ae7-42a9-b933-d0c808fca0e7 service nova] Lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:33:48 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-910ffff1-2de0-432c-a70e-30a75c499679 req-8393d1ae-8ae7-42a9-b933-d0c808fca0e7 service nova] Lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:33:48 user nova-compute[71283]: DEBUG nova.compute.manager [req-910ffff1-2de0-432c-a70e-30a75c499679 req-8393d1ae-8ae7-42a9-b933-d0c808fca0e7 service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] No waiting events found dispatching network-vif-plugged-56f36d99-9847-4d4d-bf9b-c1c2244bac79 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:33:48 user nova-compute[71283]: WARNING nova.compute.manager [req-910ffff1-2de0-432c-a70e-30a75c499679 req-8393d1ae-8ae7-42a9-b933-d0c808fca0e7 service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Received unexpected event network-vif-plugged-56f36d99-9847-4d4d-bf9b-c1c2244bac79 for instance with vm_state active and task_state rescuing. 
Apr 20 10:33:48 user nova-compute[71283]: DEBUG nova.compute.manager [req-910ffff1-2de0-432c-a70e-30a75c499679 req-8393d1ae-8ae7-42a9-b933-d0c808fca0e7 service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Received event network-vif-plugged-56f36d99-9847-4d4d-bf9b-c1c2244bac79 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:33:48 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-910ffff1-2de0-432c-a70e-30a75c499679 req-8393d1ae-8ae7-42a9-b933-d0c808fca0e7 service nova] Acquiring lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:33:48 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-910ffff1-2de0-432c-a70e-30a75c499679 req-8393d1ae-8ae7-42a9-b933-d0c808fca0e7 service nova] Lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:33:48 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-910ffff1-2de0-432c-a70e-30a75c499679 req-8393d1ae-8ae7-42a9-b933-d0c808fca0e7 service nova] Lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:33:48 user nova-compute[71283]: DEBUG nova.compute.manager [req-910ffff1-2de0-432c-a70e-30a75c499679 req-8393d1ae-8ae7-42a9-b933-d0c808fca0e7 service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] No waiting events found dispatching network-vif-plugged-56f36d99-9847-4d4d-bf9b-c1c2244bac79 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 
20 10:33:48 user nova-compute[71283]: WARNING nova.compute.manager [req-910ffff1-2de0-432c-a70e-30a75c499679 req-8393d1ae-8ae7-42a9-b933-d0c808fca0e7 service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Received unexpected event network-vif-plugged-56f36d99-9847-4d4d-bf9b-c1c2244bac79 for instance with vm_state active and task_state rescuing. Apr 20 10:33:48 user nova-compute[71283]: DEBUG nova.virt.libvirt.host [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Removed pending event for 50cff6dc-1947-417a-8c5f-0b10de5dfd3a due to event {{(pid=71283) _event_emit_delayed /opt/stack/nova/nova/virt/libvirt/host.py:438}} Apr 20 10:33:48 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Resumed> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:33:48 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] VM Resumed (Lifecycle Event) Apr 20 10:33:48 user nova-compute[71283]: DEBUG nova.compute.manager [None req-b023db88-2c8f-4ded-bcb2-76944215080a tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:33:48 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:33:48 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, 
current DB power_state: 1, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:33:48 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] During sync_power_state the instance has a pending task (rescuing). Skip. Apr 20 10:33:48 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Started> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:33:48 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] VM Started (Lifecycle Event) Apr 20 10:33:48 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:33:48 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:33:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:50 user nova-compute[71283]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:33:50 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] VM Stopped (Lifecycle Event) Apr 20 
10:33:50 user nova-compute[71283]: DEBUG nova.compute.manager [None req-6888b845-78af-41bb-b0d2-c989b5bdfac8 None None] [instance: 319eafd3-e6ac-4fcc-92b1-a5f4e60952e8] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:33:50 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:33:55 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:00 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:05 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:05 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:07 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:08 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71283) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:34:08 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Cleaning up deleted instances {{(pid=71283) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 20 10:34:08 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] There are 0 instances to clean {{(pid=71283) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 20 10:34:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:10 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:12 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:34:12 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Cleaning up deleted instances with incomplete migration {{(pid=71283) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 20 10:34:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:14 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:34:14 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:34:14 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:34:15 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:34:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:34:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:34:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:34:15 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:34:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:34:15 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:34:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 
10:34:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk --force-share --output=json" returned: 0 in 0.154s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:34:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:34:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:34:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:34:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:34:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk.rescue --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:34:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk.rescue --force-share --output=json" returned: 0 in 0.138s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:34:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk.rescue --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:34:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 
-m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk.rescue --force-share --output=json" returned: 0 in 0.180s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:34:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:34:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:34:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:34:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:34:17 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:34:17 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:34:17 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=8843MB free_disk=26.544654846191406GB free_vcpus=9 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": 
"type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", 
"vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, 
{"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": 
"15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 10:34:17 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:34:17 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:34:17 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 64fb0d55-ef35-4386-86fe-00775b83a8d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:34:17 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance e1115ad2-b858-4859-b1bb-2175f7eab867 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:34:17 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 50cff6dc-1947-417a-8c5f-0b10de5dfd3a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:34:17 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 3 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:34:17 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=896MB phys_disk=40GB used_disk=3GB total_vcpus=12 used_vcpus=3 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:34:17 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:34:17 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:34:17 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:34:17 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.302s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:34:18 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:34:18 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:34:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:19 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:34:19 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:34:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:20 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:34:20 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:34:20 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Rebuilding the list of instances to heal {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 10:34:20 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "refresh_cache-64fb0d55-ef35-4386-86fe-00775b83a8d4" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:34:20 user nova-compute[71283]: 
DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquired lock "refresh_cache-64fb0d55-ef35-4386-86fe-00775b83a8d4" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:34:20 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Forcefully refreshing network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 10:34:20 user nova-compute[71283]: DEBUG nova.objects.instance [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lazy-loading 'info_cache' on Instance uuid 64fb0d55-ef35-4386-86fe-00775b83a8d4 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:34:20 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:21 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Updating instance_info_cache with network_info: [{"id": "851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1", "address": "fa:16:3e:fe:b8:27", "network": {"id": "e68379ef-0781-4af6-be68-a311cdded61e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-509902278-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8d9c5dd66fd0496a8e92b41d98fe1727", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", 
"details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap851e6a9d-9d", "ovs_interfaceid": "851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:34:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Releasing lock "refresh_cache-64fb0d55-ef35-4386-86fe-00775b83a8d4" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:34:21 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Updated the network info_cache for instance {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 10:34:21 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:34:21 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:34:25 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:34:25 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:25 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:34:25 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:34:25 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:34:25 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:28 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:34:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 
tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquiring lock "5d73c323-31d8-4695-b8e2-5826c73ebb6b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:34:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "5d73c323-31d8-4695-b8e2-5826c73ebb6b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:34:29 user nova-compute[71283]: DEBUG nova.compute.manager [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Starting instance... 
{{(pid=71283) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 10:34:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:34:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:34:29 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71283) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 10:34:29 user nova-compute[71283]: INFO nova.compute.claims [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Claim successful on node user Apr 20 10:34:29 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:34:29 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:34:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.270s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 
20 10:34:29 user nova-compute[71283]: DEBUG nova.compute.manager [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Start building networks asynchronously for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 10:34:29 user nova-compute[71283]: DEBUG nova.compute.manager [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Allocating IP information in the background. {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 10:34:29 user nova-compute[71283]: DEBUG nova.network.neutron [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] allocate_for_instance() {{(pid=71283) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 10:34:29 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 10:34:29 user nova-compute[71283]: DEBUG nova.compute.manager [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Start building block device mappings for instance. 
{{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 10:34:29 user nova-compute[71283]: DEBUG nova.policy [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2a04be58cd354d508616edd9d5eeff54', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '12e621999b00481c839affc4e83ce37c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71283) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 10:34:29 user nova-compute[71283]: DEBUG nova.compute.manager [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Start spawning the instance on the hypervisor. 
{{(pid=71283) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 10:34:29 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Creating instance directory {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 10:34:29 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Creating image(s) Apr 20 10:34:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquiring lock "/opt/stack/data/nova/instances/5d73c323-31d8-4695-b8e2-5826c73ebb6b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:34:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "/opt/stack/data/nova/instances/5d73c323-31d8-4695-b8e2-5826c73ebb6b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:34:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 
tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "/opt/stack/data/nova/instances/5d73c323-31d8-4695-b8e2-5826c73ebb6b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:34:29 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:34:30 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.146s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:34:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquiring lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:34:30 user nova-compute[71283]: 
DEBUG oslo_concurrency.lockutils [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:34:30 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:34:30 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.145s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:34:30 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw 
/opt/stack/data/nova/instances/5d73c323-31d8-4695-b8e2-5826c73ebb6b/disk 1073741824 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:34:30 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/5d73c323-31d8-4695-b8e2-5826c73ebb6b/disk 1073741824" returned: 0 in 0.049s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:34:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.201s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:34:30 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:34:30 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 
tempest-AttachVolumeNegativeTest-1619866573-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.133s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:34:30 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Checking if we can resize image /opt/stack/data/nova/instances/5d73c323-31d8-4695-b8e2-5826c73ebb6b/disk. size=1073741824 {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 10:34:30 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d73c323-31d8-4695-b8e2-5826c73ebb6b/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:34:30 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d73c323-31d8-4695-b8e2-5826c73ebb6b/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:34:30 user nova-compute[71283]: 
DEBUG nova.virt.disk.api [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Cannot resize image /opt/stack/data/nova/instances/5d73c323-31d8-4695-b8e2-5826c73ebb6b/disk to a smaller size. {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 10:34:30 user nova-compute[71283]: DEBUG nova.objects.instance [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lazy-loading 'migration_context' on Instance uuid 5d73c323-31d8-4695-b8e2-5826c73ebb6b {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:34:30 user nova-compute[71283]: DEBUG nova.network.neutron [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Successfully created port: a9863408-9295-4836-b0a6-b785f76c092a {{(pid=71283) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 10:34:30 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Created local disks {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 10:34:30 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Ensure instance console log exists: /opt/stack/data/nova/instances/5d73c323-31d8-4695-b8e2-5826c73ebb6b/console.log {{(pid=71283) _ensure_console_log_for_instance 
/opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 10:34:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:34:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:34:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:34:30 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:31 user nova-compute[71283]: DEBUG nova.network.neutron [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Successfully updated port: a9863408-9295-4836-b0a6-b785f76c092a {{(pid=71283) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 10:34:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None 
req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquiring lock "refresh_cache-5d73c323-31d8-4695-b8e2-5826c73ebb6b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:34:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquired lock "refresh_cache-5d73c323-31d8-4695-b8e2-5826c73ebb6b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:34:31 user nova-compute[71283]: DEBUG nova.network.neutron [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Building network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 10:34:31 user nova-compute[71283]: DEBUG nova.compute.manager [req-cfb8e9a9-db14-4836-98d4-64ec5d91212b req-3416944e-20c8-4d77-a605-38dcc4aca289 service nova] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Received event network-changed-a9863408-9295-4836-b0a6-b785f76c092a {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:34:31 user nova-compute[71283]: DEBUG nova.compute.manager [req-cfb8e9a9-db14-4836-98d4-64ec5d91212b req-3416944e-20c8-4d77-a605-38dcc4aca289 service nova] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Refreshing instance network info cache due to event network-changed-a9863408-9295-4836-b0a6-b785f76c092a. 
{{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:34:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-cfb8e9a9-db14-4836-98d4-64ec5d91212b req-3416944e-20c8-4d77-a605-38dcc4aca289 service nova] Acquiring lock "refresh_cache-5d73c323-31d8-4695-b8e2-5826c73ebb6b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:34:31 user nova-compute[71283]: DEBUG nova.network.neutron [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Instance cache missing network info. {{(pid=71283) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG nova.network.neutron [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Updating instance_info_cache with network_info: [{"id": "a9863408-9295-4836-b0a6-b785f76c092a", "address": "fa:16:3e:3f:4b:ee", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9863408-92", "ovs_interfaceid": "a9863408-9295-4836-b0a6-b785f76c092a", "qbh_params": null, 
"qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Releasing lock "refresh_cache-5d73c323-31d8-4695-b8e2-5826c73ebb6b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG nova.compute.manager [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Instance network_info: |[{"id": "a9863408-9295-4836-b0a6-b785f76c092a", "address": "fa:16:3e:3f:4b:ee", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9863408-92", "ovs_interfaceid": "a9863408-9295-4836-b0a6-b785f76c092a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71283) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-cfb8e9a9-db14-4836-98d4-64ec5d91212b req-3416944e-20c8-4d77-a605-38dcc4aca289 service nova] Acquired lock "refresh_cache-5d73c323-31d8-4695-b8e2-5826c73ebb6b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG nova.network.neutron [req-cfb8e9a9-db14-4836-98d4-64ec5d91212b req-3416944e-20c8-4d77-a605-38dcc4aca289 service nova] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Refreshing network info cache for port a9863408-9295-4836-b0a6-b785f76c092a {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Start _get_guest_xml network_info=[{"id": "a9863408-9295-4836-b0a6-b785f76c092a", "address": "fa:16:3e:3f:4b:ee", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9863408-92", "ovs_interfaceid": "a9863408-9295-4836-b0a6-b785f76c092a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '3687b278-c2bc-46f2-9b7e-579c8d06fe41'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 10:34:32 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:34:32 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:34:32 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71283) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T10:25:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=), allow threads: True {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Flavor limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 
tempest-AttachVolumeNegativeTest-1619866573-project-member] Image limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Flavor pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Image pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 
10:34:32 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Got 1 possible topologies {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:34:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1225264902',display_name='tempest-AttachVolumeNegativeTest-server-1225264902',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1225264902',id=13,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHlXb2fBMUpwjWcMzz3zehln+ROLLyuvx1FKwns94rOIHkQii3UgvuVBxZIQ5wQRDDXLv5B8P3RJ3QpXQC3Ze1OezsE+YLGNFce+g2+NCB+C/troUPUvbU32x8whbZGnag==',key_name='tempest-keypair-1070174971',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12e621999b00481c839affc4e83ce37c',ramdisk_id='',reservation_id='r-175tdypi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-1619866
573',owner_user_name='tempest-AttachVolumeNegativeTest-1619866573-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:34:30Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a04be58cd354d508616edd9d5eeff54',uuid=5d73c323-31d8-4695-b8e2-5826c73ebb6b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a9863408-9295-4836-b0a6-b785f76c092a", "address": "fa:16:3e:3f:4b:ee", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9863408-92", "ovs_interfaceid": "a9863408-9295-4836-b0a6-b785f76c092a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71283) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Converting VIF {"id": "a9863408-9295-4836-b0a6-b785f76c092a", "address": "fa:16:3e:3f:4b:ee", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": 
"tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9863408-92", "ovs_interfaceid": "a9863408-9295-4836-b0a6-b785f76c092a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3f:4b:ee,bridge_name='br-int',has_traffic_filtering=True,id=a9863408-9295-4836-b0a6-b785f76c092a,network=Network(51b24d7b-a37b-453a-a335-3ba809f26953),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9863408-92') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG nova.objects.instance [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lazy-loading 'pci_devices' on Instance uuid 5d73c323-31d8-4695-b8e2-5826c73ebb6b {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None 
req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] End _get_guest_xml xml= [libvirt domain XML elided: the XML markup was stripped when this log was captured; the surviving element values were uuid 5d73c323-31d8-4695-b8e2-5826c73ebb6b, name instance-0000000d, memory 131072, vcpus 1, nova:name tempest-AttachVolumeNegativeTest-server-1225264902, creation time 2023-04-20 10:34:32, flavor fields 128/1/0/0/1, owner tempest-AttachVolumeNegativeTest-1619866573-project-member / tempest-AttachVolumeNegativeTest-1619866573, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0, product Virtual Machine, os type hvm, CPU model Nehalem, rng backend /dev/urandom] {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:34:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1225264902',display_name='tempest-AttachVolumeNegativeTest-server-1225264902',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1225264902',id=13,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHlXb2fBMUpwjWcMzz3zehln+ROLLyuvx1FKwns94rOIHkQii3UgvuVBxZIQ5wQRDDXLv5B8P3RJ3QpXQC3Ze1OezsE+YLGNFce+g2+NCB+C/troUPUvbU32x8whbZGnag==',key_name='tempest-keypair-1070174971',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12e621999b00481c839affc4e83ce37c',ramdisk_id='',reservation_id='r-175tdypi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-1619866573',owner_user_name='tempest-AttachVolumeNegativeTest-1619866573-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:34:30Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a04be58cd354d508616edd9d5eeff54',uuid=5d73c323-31d8-4695-b8e2-5826c73ebb6b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a9863408-9295-4836-b0a6-b785f76c092a", "address": "fa:16:3e:3f:4b:ee", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": 
{"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9863408-92", "ovs_interfaceid": "a9863408-9295-4836-b0a6-b785f76c092a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Converting VIF {"id": "a9863408-9295-4836-b0a6-b785f76c092a", "address": "fa:16:3e:3f:4b:ee", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9863408-92", "ovs_interfaceid": "a9863408-9295-4836-b0a6-b785f76c092a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3f:4b:ee,bridge_name='br-int',has_traffic_filtering=True,id=a9863408-9295-4836-b0a6-b785f76c092a,network=Network(51b24d7b-a37b-453a-a335-3ba809f26953),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9863408-92') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG os_vif [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3f:4b:ee,bridge_name='br-int',has_traffic_filtering=True,id=a9863408-9295-4836-b0a6-b785f76c092a,network=Network(51b24d7b-a37b-453a-a335-3ba809f26953),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9863408-92') {{(pid=71283) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71283) do_commit 
/usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa9863408-92, may_exist=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa9863408-92, col_values=(('external_ids', {'iface-id': 'a9863408-9295-4836-b0a6-b785f76c092a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3f:4b:ee', 'vm-uuid': '5d73c323-31d8-4695-b8e2-5826c73ebb6b'}),)) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:32 user nova-compute[71283]: INFO os_vif [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3f:4b:ee,bridge_name='br-int',has_traffic_filtering=True,id=a9863408-9295-4836-b0a6-b785f76c092a,network=Network(51b24d7b-a37b-453a-a335-3ba809f26953),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9863408-92') Apr 20 10:34:32 
user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] No BDM found with device name vda, not building metadata. {{(pid=71283) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] No VIF found with MAC fa:16:3e:3f:4b:ee, not building metadata {{(pid=71283) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 10:34:32 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._sync_power_states {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:34:33 user nova-compute[71283]: DEBUG nova.network.neutron [req-cfb8e9a9-db14-4836-98d4-64ec5d91212b req-3416944e-20c8-4d77-a605-38dcc4aca289 service nova] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Updated VIF entry in instance network info cache for port a9863408-9295-4836-b0a6-b785f76c092a. 
{{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:34:33 user nova-compute[71283]: DEBUG nova.network.neutron [req-cfb8e9a9-db14-4836-98d4-64ec5d91212b req-3416944e-20c8-4d77-a605-38dcc4aca289 service nova] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Updating instance_info_cache with network_info: [{"id": "a9863408-9295-4836-b0a6-b785f76c092a", "address": "fa:16:3e:3f:4b:ee", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9863408-92", "ovs_interfaceid": "a9863408-9295-4836-b0a6-b785f76c092a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:34:33 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-cfb8e9a9-db14-4836-98d4-64ec5d91212b req-3416944e-20c8-4d77-a605-38dcc4aca289 service nova] Releasing lock "refresh_cache-5d73c323-31d8-4695-b8e2-5826c73ebb6b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:34:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:33 user 
nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:34 user nova-compute[71283]: DEBUG nova.compute.manager [req-6d8b7c82-faba-45fc-b57f-1a4ed58f6c6d req-1424676f-5269-424a-b6d2-0cfbd563e845 service nova] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Received event network-vif-plugged-a9863408-9295-4836-b0a6-b785f76c092a {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:34:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-6d8b7c82-faba-45fc-b57f-1a4ed58f6c6d req-1424676f-5269-424a-b6d2-0cfbd563e845 service nova] Acquiring lock "5d73c323-31d8-4695-b8e2-5826c73ebb6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:34:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-6d8b7c82-faba-45fc-b57f-1a4ed58f6c6d req-1424676f-5269-424a-b6d2-0cfbd563e845 service nova] Lock "5d73c323-31d8-4695-b8e2-5826c73ebb6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:34:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-6d8b7c82-faba-45fc-b57f-1a4ed58f6c6d req-1424676f-5269-424a-b6d2-0cfbd563e845 service nova] Lock "5d73c323-31d8-4695-b8e2-5826c73ebb6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.003s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:34:34 user
nova-compute[71283]: DEBUG nova.compute.manager [req-6d8b7c82-faba-45fc-b57f-1a4ed58f6c6d req-1424676f-5269-424a-b6d2-0cfbd563e845 service nova] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] No waiting events found dispatching network-vif-plugged-a9863408-9295-4836-b0a6-b785f76c092a {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:34:34 user nova-compute[71283]: WARNING nova.compute.manager [req-6d8b7c82-faba-45fc-b57f-1a4ed58f6c6d req-1424676f-5269-424a-b6d2-0cfbd563e845 service nova] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Received unexpected event network-vif-plugged-a9863408-9295-4836-b0a6-b785f76c092a for instance with vm_state building and task_state spawning. Apr 20 10:34:34 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Triggering sync for uuid 64fb0d55-ef35-4386-86fe-00775b83a8d4 {{(pid=71283) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 20 10:34:34 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Triggering sync for uuid e1115ad2-b858-4859-b1bb-2175f7eab867 {{(pid=71283) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 20 10:34:34 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Triggering sync for uuid 50cff6dc-1947-417a-8c5f-0b10de5dfd3a {{(pid=71283) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 20 10:34:34 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Triggering sync for uuid 5d73c323-31d8-4695-b8e2-5826c73ebb6b {{(pid=71283) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 20 10:34:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "64fb0d55-ef35-4386-86fe-00775b83a8d4" by 
"nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:34:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "64fb0d55-ef35-4386-86fe-00775b83a8d4" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:34:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "e1115ad2-b858-4859-b1bb-2175f7eab867" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:34:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "e1115ad2-b858-4859-b1bb-2175f7eab867" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:34:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:34:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a" acquired by
"nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:34:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "5d73c323-31d8-4695-b8e2-5826c73ebb6b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:34:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "64fb0d55-ef35-4386-86fe-00775b83a8d4" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.059s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:34:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "e1115ad2-b858-4859-b1bb-2175f7eab867" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.060s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:34:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.061s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:34:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20
10:34:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:35 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Resumed> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:34:35 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] VM Resumed (Lifecycle Event) Apr 20 10:34:35 user nova-compute[71283]: DEBUG nova.compute.manager [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Instance event wait completed in 0 seconds for {{(pid=71283) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 10:34:35 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Guest created on hypervisor {{(pid=71283) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 10:34:35 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Instance spawned successfully. 
Apr 20 10:34:35 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 10:34:35 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:34:35 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:34:35 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Found default for hw_cdrom_bus of ide {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:34:35 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Found default for hw_disk_bus of virtio {{(pid=71283) 
_register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:34:35 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Found default for hw_input_bus of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:34:35 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Found default for hw_pointer_model of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:34:35 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Found default for hw_video_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:34:35 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Found default for hw_vif_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:34:35 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 20 10:34:35 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Started> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:34:35 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] VM Started (Lifecycle Event) Apr 20 10:34:35 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:34:35 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:34:35 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 10:34:35 user nova-compute[71283]: INFO nova.compute.manager [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Took 5.85 seconds to spawn the instance on the hypervisor. 
Apr 20 10:34:35 user nova-compute[71283]: DEBUG nova.compute.manager [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:34:35 user nova-compute[71283]: INFO nova.compute.manager [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Took 6.44 seconds to build instance. Apr 20 10:34:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-1e63bb7f-2b15-44e0-8add-32b00a109f18 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "5d73c323-31d8-4695-b8e2-5826c73ebb6b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.531s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:34:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "5d73c323-31d8-4695-b8e2-5826c73ebb6b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 1.863s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:34:35 user nova-compute[71283]: INFO nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 20 10:34:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "5d73c323-31d8-4695-b8e2-5826c73ebb6b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:34:36 user nova-compute[71283]: DEBUG nova.compute.manager [req-f37d149c-05b1-45b7-aa52-9b5b5c5d138d req-62b51edb-3a70-456f-ae18-4f2704dbf6d0 service nova] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Received event network-vif-plugged-a9863408-9295-4836-b0a6-b785f76c092a {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:34:36 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-f37d149c-05b1-45b7-aa52-9b5b5c5d138d req-62b51edb-3a70-456f-ae18-4f2704dbf6d0 service nova] Acquiring lock "5d73c323-31d8-4695-b8e2-5826c73ebb6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:34:36 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-f37d149c-05b1-45b7-aa52-9b5b5c5d138d req-62b51edb-3a70-456f-ae18-4f2704dbf6d0 service nova] Lock "5d73c323-31d8-4695-b8e2-5826c73ebb6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:34:36 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-f37d149c-05b1-45b7-aa52-9b5b5c5d138d req-62b51edb-3a70-456f-ae18-4f2704dbf6d0 service nova] Lock "5d73c323-31d8-4695-b8e2-5826c73ebb6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:34:36 user nova-compute[71283]: DEBUG nova.compute.manager [req-f37d149c-05b1-45b7-aa52-9b5b5c5d138d req-62b51edb-3a70-456f-ae18-4f2704dbf6d0 service nova] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] No waiting events found dispatching network-vif-plugged-a9863408-9295-4836-b0a6-b785f76c092a {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:34:36 user nova-compute[71283]: WARNING nova.compute.manager [req-f37d149c-05b1-45b7-aa52-9b5b5c5d138d req-62b51edb-3a70-456f-ae18-4f2704dbf6d0 service nova] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Received unexpected event network-vif-plugged-a9863408-9295-4836-b0a6-b785f76c092a for instance with vm_state active and task_state None. Apr 20 10:34:37 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:39 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:42 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:47 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:47 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:49 user nova-compute[71283]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:51 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:52 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:53 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:56 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Acquiring lock "50181cb3-e752-4b72-a09d-e7fdae1edd8f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:34:56 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "50181cb3-e752-4b72-a09d-e7fdae1edd8f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:34:57 user nova-compute[71283]: DEBUG nova.compute.manager [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 
tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Starting instance... {{(pid=71283) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 10:34:57 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:34:57 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:34:57 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71283) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 10:34:57 user nova-compute[71283]: INFO nova.compute.claims [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Claim successful on node user Apr 20 10:34:57 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:34:57 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:57 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:34:57 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.266s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:34:57 user nova-compute[71283]: DEBUG nova.compute.manager [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Start building networks asynchronously for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 10:34:57 user nova-compute[71283]: DEBUG nova.compute.manager [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Allocating IP information in the background. {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 10:34:57 user nova-compute[71283]: DEBUG nova.network.neutron [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] allocate_for_instance() {{(pid=71283) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 10:34:57 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 20 10:34:57 user nova-compute[71283]: DEBUG nova.compute.manager [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Start building block device mappings for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 10:34:57 user nova-compute[71283]: DEBUG nova.policy [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '878178b690704d048c37c79d596e953b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3e5913ed7eb944ddae4b38e1d746e7b9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71283) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 10:34:57 user nova-compute[71283]: DEBUG nova.compute.manager [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Start spawning the instance on the hypervisor. 
{{(pid=71283) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 10:34:57 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Creating instance directory {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 10:34:57 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Creating image(s) Apr 20 10:34:57 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Acquiring lock "/opt/stack/data/nova/instances/50181cb3-e752-4b72-a09d-e7fdae1edd8f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:34:57 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "/opt/stack/data/nova/instances/50181cb3-e752-4b72-a09d-e7fdae1edd8f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:34:57 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] 
Lock "/opt/stack/data/nova/instances/50181cb3-e752-4b72-a09d-e7fdae1edd8f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:34:57 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Acquiring lock "6f6df8d86578d47dccdada06c204fa5214f9f97d" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:34:57 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "6f6df8d86578d47dccdada06c204fa5214f9f97d" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:34:57 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/6f6df8d86578d47dccdada06c204fa5214f9f97d.part --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] 
CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/6f6df8d86578d47dccdada06c204fa5214f9f97d.part --force-share --output=json" returned: 0 in 0.138s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG nova.virt.images [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] 121cb977-6d50-466e-b474-423e6e1e40c7 was qcow2, converting to raw {{(pid=71283) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG nova.privsep.utils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71283) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/6f6df8d86578d47dccdada06c204fa5214f9f97d.part /opt/stack/data/nova/instances/_base/6f6df8d86578d47dccdada06c204fa5214f9f97d.converted {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/6f6df8d86578d47dccdada06c204fa5214f9f97d.part 
/opt/stack/data/nova/instances/_base/6f6df8d86578d47dccdada06c204fa5214f9f97d.converted" returned: 0 in 0.152s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/6f6df8d86578d47dccdada06c204fa5214f9f97d.converted --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG nova.network.neutron [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Successfully created port: fb9928c1-f326-4b03-847c-3e450689dedb {{(pid=71283) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/6f6df8d86578d47dccdada06c204fa5214f9f97d.converted --force-share --output=json" returned: 0 in 0.133s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 
tempest-TestMinimumBasicScenario-244367270-project-member] Lock "6f6df8d86578d47dccdada06c204fa5214f9f97d" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.766s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/6f6df8d86578d47dccdada06c204fa5214f9f97d --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/6f6df8d86578d47dccdada06c204fa5214f9f97d --force-share --output=json" returned: 0 in 0.128s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Acquiring lock "6f6df8d86578d47dccdada06c204fa5214f9f97d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None 
req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "6f6df8d86578d47dccdada06c204fa5214f9f97d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/6f6df8d86578d47dccdada06c204fa5214f9f97d --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/6f6df8d86578d47dccdada06c204fa5214f9f97d --force-share --output=json" returned: 0 in 0.131s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/6f6df8d86578d47dccdada06c204fa5214f9f97d,backing_fmt=raw 
/opt/stack/data/nova/instances/50181cb3-e752-4b72-a09d-e7fdae1edd8f/disk 1073741824 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/6f6df8d86578d47dccdada06c204fa5214f9f97d,backing_fmt=raw /opt/stack/data/nova/instances/50181cb3-e752-4b72-a09d-e7fdae1edd8f/disk 1073741824" returned: 0 in 0.051s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "6f6df8d86578d47dccdada06c204fa5214f9f97d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.189s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/6f6df8d86578d47dccdada06c204fa5214f9f97d --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 
tempest-TestMinimumBasicScenario-244367270-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/6f6df8d86578d47dccdada06c204fa5214f9f97d --force-share --output=json" returned: 0 in 0.135s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Checking if we can resize image /opt/stack/data/nova/instances/50181cb3-e752-4b72-a09d-e7fdae1edd8f/disk. size=1073741824 {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50181cb3-e752-4b72-a09d-e7fdae1edd8f/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG nova.network.neutron [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Successfully updated port: fb9928c1-f326-4b03-847c-3e450689dedb {{(pid=71283) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 
tempest-TestMinimumBasicScenario-244367270-project-member] Acquiring lock "refresh_cache-50181cb3-e752-4b72-a09d-e7fdae1edd8f" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Acquired lock "refresh_cache-50181cb3-e752-4b72-a09d-e7fdae1edd8f" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG nova.network.neutron [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Building network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50181cb3-e752-4b72-a09d-e7fdae1edd8f/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Cannot resize image /opt/stack/data/nova/instances/50181cb3-e752-4b72-a09d-e7fdae1edd8f/disk to a smaller size. 
{{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG nova.objects.instance [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lazy-loading 'migration_context' on Instance uuid 50181cb3-e752-4b72-a09d-e7fdae1edd8f {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG nova.compute.manager [req-951e5fd7-bf1d-4714-999a-1a6435f665f5 req-ea068ee4-e921-4340-9313-150cee8d7c43 service nova] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Received event network-changed-fb9928c1-f326-4b03-847c-3e450689dedb {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG nova.compute.manager [req-951e5fd7-bf1d-4714-999a-1a6435f665f5 req-ea068ee4-e921-4340-9313-150cee8d7c43 service nova] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Refreshing instance network info cache due to event network-changed-fb9928c1-f326-4b03-847c-3e450689dedb. 
{{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-951e5fd7-bf1d-4714-999a-1a6435f665f5 req-ea068ee4-e921-4340-9313-150cee8d7c43 service nova] Acquiring lock "refresh_cache-50181cb3-e752-4b72-a09d-e7fdae1edd8f" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Created local disks {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Ensure instance console log exists: /opt/stack/data/nova/instances/50181cb3-e752-4b72-a09d-e7fdae1edd8f/console.log {{(pid=71283) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:34:58 user nova-compute[71283]: DEBUG nova.network.neutron [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Instance cache missing network info. {{(pid=71283) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG nova.network.neutron [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Updating instance_info_cache with network_info: [{"id": "fb9928c1-f326-4b03-847c-3e450689dedb", "address": "fa:16:3e:8a:69:04", "network": {"id": "0ebbf38f-49de-48cb-ab19-2af234cb93fc", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-649679020-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5913ed7eb944ddae4b38e1d746e7b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": 
true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb9928c1-f3", "ovs_interfaceid": "fb9928c1-f326-4b03-847c-3e450689dedb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Releasing lock "refresh_cache-50181cb3-e752-4b72-a09d-e7fdae1edd8f" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG nova.compute.manager [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Instance network_info: |[{"id": "fb9928c1-f326-4b03-847c-3e450689dedb", "address": "fa:16:3e:8a:69:04", "network": {"id": "0ebbf38f-49de-48cb-ab19-2af234cb93fc", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-649679020-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5913ed7eb944ddae4b38e1d746e7b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb9928c1-f3", "ovs_interfaceid": "fb9928c1-f326-4b03-847c-3e450689dedb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": 
"normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-951e5fd7-bf1d-4714-999a-1a6435f665f5 req-ea068ee4-e921-4340-9313-150cee8d7c43 service nova] Acquired lock "refresh_cache-50181cb3-e752-4b72-a09d-e7fdae1edd8f" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG nova.network.neutron [req-951e5fd7-bf1d-4714-999a-1a6435f665f5 req-ea068ee4-e921-4340-9313-150cee8d7c43 service nova] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Refreshing network info cache for port fb9928c1-f326-4b03-847c-3e450689dedb {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Start _get_guest_xml network_info=[{"id": "fb9928c1-f326-4b03-847c-3e450689dedb", "address": "fa:16:3e:8a:69:04", "network": {"id": "0ebbf38f-49de-48cb-ab19-2af234cb93fc", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-649679020-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5913ed7eb944ddae4b38e1d746e7b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb9928c1-f3", "ovs_interfaceid": 
"fb9928c1-f326-4b03-847c-3e450689dedb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:34:54Z,direct_url=,disk_format='qcow2',id=121cb977-6d50-466e-b474-423e6e1e40c7,min_disk=0,min_ram=0,name='tempest-scenario-img--1177728255',owner='3e5913ed7eb944ddae4b38e1d746e7b9',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:34:56Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '121cb977-6d50-466e-b474-423e6e1e40c7'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 10:34:59 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:34:59 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 20 10:34:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71283) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T10:25:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:34:54Z,direct_url=,disk_format='qcow2',id=121cb977-6d50-466e-b474-423e6e1e40c7,min_disk=0,min_ram=0,name='tempest-scenario-img--1177728255',owner='3e5913ed7eb944ddae4b38e1d746e7b9',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:34:56Z,virtual_size=,visibility=), allow threads: True {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Flavor limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 
tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Image limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Flavor pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Image pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71283) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Got 1 possible topologies {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:34:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1531913966',display_name='tempest-TestMinimumBasicScenario-server-1531913966',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1531913966',id=14,image_ref='121cb977-6d50-466e-b474-423e6e1e40c7',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPha8UTs+y6zQwYiHYJU8S3tkn97JtlaaAeIMql6mOiK4l9+Zu9OIlPQG4QaIz4bSwaRtyjprIcWT35a8BtT0jDPIeyBy776x5V6SLLxayNSY8lG5hkGVg+EEmbLws1bHQ==',key_name='tempest-TestMinimumBasicScenario-693580580',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3e5913ed7eb944ddae4b38e1d746e7b9',ramdisk_id='',reservation_id='r-380fuc8d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='121cb977-6d50-466e-b474-423e6e1e40c7',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-244367270',owner_user_name='tempest-TestMinimumBasicScenario-244367270-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2
023-04-20T10:34:57Z,user_data=None,user_id='878178b690704d048c37c79d596e953b',uuid=50181cb3-e752-4b72-a09d-e7fdae1edd8f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fb9928c1-f326-4b03-847c-3e450689dedb", "address": "fa:16:3e:8a:69:04", "network": {"id": "0ebbf38f-49de-48cb-ab19-2af234cb93fc", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-649679020-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5913ed7eb944ddae4b38e1d746e7b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb9928c1-f3", "ovs_interfaceid": "fb9928c1-f326-4b03-847c-3e450689dedb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71283) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Converting VIF {"id": "fb9928c1-f326-4b03-847c-3e450689dedb", "address": "fa:16:3e:8a:69:04", "network": {"id": "0ebbf38f-49de-48cb-ab19-2af234cb93fc", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-649679020-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5913ed7eb944ddae4b38e1d746e7b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb9928c1-f3", "ovs_interfaceid": "fb9928c1-f326-4b03-847c-3e450689dedb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:69:04,bridge_name='br-int',has_traffic_filtering=True,id=fb9928c1-f326-4b03-847c-3e450689dedb,network=Network(0ebbf38f-49de-48cb-ab19-2af234cb93fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb9928c1-f3') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG nova.objects.instance [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lazy-loading 'pci_devices' on Instance uuid 50181cb3-e752-4b72-a09d-e7fdae1edd8f {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] End _get_guest_xml xml=
[guest domain XML garbled in capture: the element markup was stripped, leaving only syslog prefixes and orphaned text nodes. Recoverable values: uuid 50181cb3-e752-4b72-a09d-e7fdae1edd8f, name instance-0000000e, memory 131072 KiB, 1 vCPU, title tempest-TestMinimumBasicScenario-server-1531913966, creation time 2023-04-20 10:34:59, owner tempest-TestMinimumBasicScenario-244367270-project-member / tempest-TestMinimumBasicScenario-244367270, sysinfo OpenStack Foundation / OpenStack Nova 0.0.0, os type hvm, CPU model Nehalem, RNG backend /dev/urandom]
{{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif 
[None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:34:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1531913966',display_name='tempest-TestMinimumBasicScenario-server-1531913966',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1531913966',id=14,image_ref='121cb977-6d50-466e-b474-423e6e1e40c7',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPha8UTs+y6zQwYiHYJU8S3tkn97JtlaaAeIMql6mOiK4l9+Zu9OIlPQG4QaIz4bSwaRtyjprIcWT35a8BtT0jDPIeyBy776x5V6SLLxayNSY8lG5hkGVg+EEmbLws1bHQ==',key_name='tempest-TestMinimumBasicScenario-693580580',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3e5913ed7eb944ddae4b38e1d746e7b9',ramdisk_id='',reservation_id='r-380fuc8d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='121cb977-6d50-466e-b474-423e6e1e40c7',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenari
o-244367270',owner_user_name='tempest-TestMinimumBasicScenario-244367270-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:34:57Z,user_data=None,user_id='878178b690704d048c37c79d596e953b',uuid=50181cb3-e752-4b72-a09d-e7fdae1edd8f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fb9928c1-f326-4b03-847c-3e450689dedb", "address": "fa:16:3e:8a:69:04", "network": {"id": "0ebbf38f-49de-48cb-ab19-2af234cb93fc", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-649679020-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5913ed7eb944ddae4b38e1d746e7b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb9928c1-f3", "ovs_interfaceid": "fb9928c1-f326-4b03-847c-3e450689dedb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Converting VIF {"id": "fb9928c1-f326-4b03-847c-3e450689dedb", "address": "fa:16:3e:8a:69:04", "network": {"id": "0ebbf38f-49de-48cb-ab19-2af234cb93fc", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-649679020-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 
4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5913ed7eb944ddae4b38e1d746e7b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb9928c1-f3", "ovs_interfaceid": "fb9928c1-f326-4b03-847c-3e450689dedb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:69:04,bridge_name='br-int',has_traffic_filtering=True,id=fb9928c1-f326-4b03-847c-3e450689dedb,network=Network(0ebbf38f-49de-48cb-ab19-2af234cb93fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb9928c1-f3') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG os_vif [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:69:04,bridge_name='br-int',has_traffic_filtering=True,id=fb9928c1-f326-4b03-847c-3e450689dedb,network=Network(0ebbf38f-49de-48cb-ab19-2af234cb93fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb9928c1-f3') {{(pid=71283) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 10:34:59 user 
nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfb9928c1-f3, may_exist=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfb9928c1-f3, col_values=(('external_ids', {'iface-id': 'fb9928c1-f326-4b03-847c-3e450689dedb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8a:69:04', 'vm-uuid': '50181cb3-e752-4b72-a09d-e7fdae1edd8f'}),)) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:59 user nova-compute[71283]: INFO os_vif [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:69:04,bridge_name='br-int',has_traffic_filtering=True,id=fb9928c1-f326-4b03-847c-3e450689dedb,network=Network(0ebbf38f-49de-48cb-ab19-2af234cb93fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb9928c1-f3') Apr 20 10:34:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71283) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] No VIF found with MAC fa:16:3e:8a:69:04, not building metadata {{(pid=71283) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG nova.network.neutron [req-951e5fd7-bf1d-4714-999a-1a6435f665f5 req-ea068ee4-e921-4340-9313-150cee8d7c43 service nova] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Updated VIF entry in instance network info cache for port fb9928c1-f326-4b03-847c-3e450689dedb. {{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:34:59 user nova-compute[71283]: DEBUG nova.network.neutron [req-951e5fd7-bf1d-4714-999a-1a6435f665f5 req-ea068ee4-e921-4340-9313-150cee8d7c43 service nova] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Updating instance_info_cache with network_info: [{"id": "fb9928c1-f326-4b03-847c-3e450689dedb", "address": "fa:16:3e:8a:69:04", "network": {"id": "0ebbf38f-49de-48cb-ab19-2af234cb93fc", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-649679020-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5913ed7eb944ddae4b38e1d746e7b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb9928c1-f3", "ovs_interfaceid": "fb9928c1-f326-4b03-847c-3e450689dedb", "qbh_params": null, "qbg_params": 
null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:35:00 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-951e5fd7-bf1d-4714-999a-1a6435f665f5 req-ea068ee4-e921-4340-9313-150cee8d7c43 service nova] Releasing lock "refresh_cache-50181cb3-e752-4b72-a09d-e7fdae1edd8f" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:35:00 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:35:00 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:35:00 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:35:00 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:35:00 user nova-compute[71283]: DEBUG nova.compute.manager [req-de82e55b-66ef-4e85-88a9-d6bfefb78a31 req-e3986a51-7abe-4900-8b12-5a51b652e335 service nova] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Received event network-vif-plugged-fb9928c1-f326-4b03-847c-3e450689dedb {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:35:00 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-de82e55b-66ef-4e85-88a9-d6bfefb78a31 req-e3986a51-7abe-4900-8b12-5a51b652e335 service nova] Acquiring lock "50181cb3-e752-4b72-a09d-e7fdae1edd8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:35:00 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-de82e55b-66ef-4e85-88a9-d6bfefb78a31 req-e3986a51-7abe-4900-8b12-5a51b652e335 service nova] Lock "50181cb3-e752-4b72-a09d-e7fdae1edd8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:35:00 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-de82e55b-66ef-4e85-88a9-d6bfefb78a31 req-e3986a51-7abe-4900-8b12-5a51b652e335 service nova] Lock "50181cb3-e752-4b72-a09d-e7fdae1edd8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:35:00 user nova-compute[71283]: DEBUG nova.compute.manager [req-de82e55b-66ef-4e85-88a9-d6bfefb78a31 req-e3986a51-7abe-4900-8b12-5a51b652e335 service nova] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] No waiting events found dispatching network-vif-plugged-fb9928c1-f326-4b03-847c-3e450689dedb {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:35:00 user nova-compute[71283]: WARNING nova.compute.manager [req-de82e55b-66ef-4e85-88a9-d6bfefb78a31 req-e3986a51-7abe-4900-8b12-5a51b652e335 service nova] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Received unexpected event network-vif-plugged-fb9928c1-f326-4b03-847c-3e450689dedb for instance with vm_state building and task_state spawning. 
Apr 20 10:35:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:35:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:35:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:35:02 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Resumed> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:35:02 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] VM Resumed (Lifecycle Event) Apr 20 10:35:02 user nova-compute[71283]: DEBUG nova.compute.manager [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Instance event wait completed in 0 seconds for {{(pid=71283) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 10:35:02 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Guest created on hypervisor {{(pid=71283) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 10:35:02 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Instance spawned successfully. 
Apr 20 10:35:02 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 10:35:02 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:35:02 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:35:02 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Found default for hw_cdrom_bus of ide {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:35:02 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Found default for hw_disk_bus of virtio {{(pid=71283) 
_register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:35:02 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Found default for hw_input_bus of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:35:02 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Found default for hw_pointer_model of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:35:02 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Found default for hw_video_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:35:02 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Found default for hw_vif_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:35:02 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 20 10:35:02 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Started> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:35:02 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] VM Started (Lifecycle Event) Apr 20 10:35:02 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:35:02 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:35:02 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 10:35:02 user nova-compute[71283]: INFO nova.compute.manager [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Took 5.35 seconds to spawn the instance on the hypervisor. 
Apr 20 10:35:02 user nova-compute[71283]: DEBUG nova.compute.manager [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:35:02 user nova-compute[71283]: INFO nova.compute.manager [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Took 5.92 seconds to build instance. Apr 20 10:35:03 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-433c3dc1-4f10-4eec-80a7-a203dca1d6a5 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "50181cb3-e752-4b72-a09d-e7fdae1edd8f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.028s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:35:03 user nova-compute[71283]: DEBUG nova.compute.manager [req-185a680a-44cc-4c80-b9c4-29a78246ca55 req-ff51403f-8a3d-412b-b6b0-38f0a0cefe7e service nova] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Received event network-vif-plugged-fb9928c1-f326-4b03-847c-3e450689dedb {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:35:03 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-185a680a-44cc-4c80-b9c4-29a78246ca55 req-ff51403f-8a3d-412b-b6b0-38f0a0cefe7e service nova] Acquiring lock "50181cb3-e752-4b72-a09d-e7fdae1edd8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:35:03 user nova-compute[71283]: DEBUG 
oslo_concurrency.lockutils [req-185a680a-44cc-4c80-b9c4-29a78246ca55 req-ff51403f-8a3d-412b-b6b0-38f0a0cefe7e service nova] Lock "50181cb3-e752-4b72-a09d-e7fdae1edd8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:35:03 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-185a680a-44cc-4c80-b9c4-29a78246ca55 req-ff51403f-8a3d-412b-b6b0-38f0a0cefe7e service nova] Lock "50181cb3-e752-4b72-a09d-e7fdae1edd8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:35:03 user nova-compute[71283]: DEBUG nova.compute.manager [req-185a680a-44cc-4c80-b9c4-29a78246ca55 req-ff51403f-8a3d-412b-b6b0-38f0a0cefe7e service nova] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] No waiting events found dispatching network-vif-plugged-fb9928c1-f326-4b03-847c-3e450689dedb {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:35:03 user nova-compute[71283]: WARNING nova.compute.manager [req-185a680a-44cc-4c80-b9c4-29a78246ca55 req-ff51403f-8a3d-412b-b6b0-38f0a0cefe7e service nova] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Received unexpected event network-vif-plugged-fb9928c1-f326-4b03-847c-3e450689dedb for instance with vm_state active and task_state None. 
Apr 20 10:35:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:35:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:35:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:35:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:35:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:35:16 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:35:16 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:35:16 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:35:16 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic 
task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:35:16 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:35:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:35:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:35:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:35:16 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:35:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d73c323-31d8-4695-b8e2-5826c73ebb6b/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:35:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d73c323-31d8-4695-b8e2-5826c73ebb6b/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:35:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d73c323-31d8-4695-b8e2-5826c73ebb6b/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:35:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d73c323-31d8-4695-b8e2-5826c73ebb6b/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:35:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/50181cb3-e752-4b72-a09d-e7fdae1edd8f/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:35:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50181cb3-e752-4b72-a09d-e7fdae1edd8f/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:35:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50181cb3-e752-4b72-a09d-e7fdae1edd8f/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:35:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50181cb3-e752-4b72-a09d-e7fdae1edd8f/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:35:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk --force-share 
--output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:35:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:35:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:35:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:35:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json {{(pid=71283) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:35:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:35:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:35:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:35:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk.rescue --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 
10:35:18 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk.rescue --force-share --output=json" returned: 0 in 0.156s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:35:18 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk.rescue --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:35:18 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk.rescue --force-share --output=json" returned: 0 in 0.129s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:35:18 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:35:18 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:35:18 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:35:18 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:35:19 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:35:19 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:35:19 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=8613MB free_disk=26.443695068359375GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 
20 10:35:19 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:35:19 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:35:19 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 64fb0d55-ef35-4386-86fe-00775b83a8d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:35:19 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance e1115ad2-b858-4859-b1bb-2175f7eab867 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:35:19 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 50cff6dc-1947-417a-8c5f-0b10de5dfd3a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:35:19 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 5d73c323-31d8-4695-b8e2-5826c73ebb6b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:35:19 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 50181cb3-e752-4b72-a09d-e7fdae1edd8f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:35:19 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 5 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:35:19 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=1152MB phys_disk=40GB used_disk=5GB total_vcpus=12 used_vcpus=5 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:35:19 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Refreshing inventories for resource provider bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 20 10:35:19 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Updating ProviderTree inventory for provider bdbc83bd-9307-4e20-8e3d-430b77499399 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 20 10:35:19 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Updating inventory in ProviderTree for provider bdbc83bd-9307-4e20-8e3d-430b77499399 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 20 10:35:19 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Refreshing aggregate associations for resource provider bdbc83bd-9307-4e20-8e3d-430b77499399, aggregates: None {{(pid=71283) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 20 10:35:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:35:19 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Refreshing trait associations for resource provider 
bdbc83bd-9307-4e20-8e3d-430b77499399, traits: COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_EXTEND,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE {{(pid=71283) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 20 10:35:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:35:19 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:35:19 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider 
bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:35:19 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:35:19 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.576s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:35:22 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:35:22 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:35:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "refresh_cache-e1115ad2-b858-4859-b1bb-2175f7eab867" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 
10:35:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquired lock "refresh_cache-e1115ad2-b858-4859-b1bb-2175f7eab867" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:35:22 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Forcefully refreshing network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 10:35:23 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Updating instance_info_cache with network_info: [{"id": "c47ac25a-c0d0-4443-8759-c573281b63f9", "address": "fa:16:3e:bd:ab:60", "network": {"id": "ae2c97fe-5f89-4830-9303-439b5956dc16", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1597087439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d6d8880a9263444cba94725a83974403", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47ac25a-c0", "ovs_interfaceid": "c47ac25a-c0d0-4443-8759-c573281b63f9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:35:23 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils 
[None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Releasing lock "refresh_cache-e1115ad2-b858-4859-b1bb-2175f7eab867" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:35:23 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Updated the network info_cache for instance {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 10:35:23 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:35:23 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:35:23 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:35:23 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:35:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:35:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:35:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:35:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:35:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:35:39 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:35:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:35:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:35:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:35:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition 
/usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:35:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:35:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:35:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:35:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:35:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:35:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:35:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:35:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:35:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:35:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:35:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:35:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:36:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:36:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:36:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:36:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:36:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:36:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:36:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:36:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} 
Apr 20 10:36:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:36:16 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:36:16 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:36:16 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:36:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:36:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:36:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:36:16 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:36:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d73c323-31d8-4695-b8e2-5826c73ebb6b/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:36:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d73c323-31d8-4695-b8e2-5826c73ebb6b/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:36:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d73c323-31d8-4695-b8e2-5826c73ebb6b/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:36:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d73c323-31d8-4695-b8e2-5826c73ebb6b/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:36:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50181cb3-e752-4b72-a09d-e7fdae1edd8f/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:36:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50181cb3-e752-4b72-a09d-e7fdae1edd8f/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:36:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50181cb3-e752-4b72-a09d-e7fdae1edd8f/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:36:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50181cb3-e752-4b72-a09d-e7fdae1edd8f/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:36:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:36:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk --force-share --output=json" returned: 0 in 0.152s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:36:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:36:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:36:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:36:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:36:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:36:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share 
--output=json" returned: 0 in 0.135s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:36:18 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk.rescue --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:36:18 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk.rescue --force-share --output=json" returned: 0 in 0.136s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:36:18 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk.rescue --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:36:18 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk.rescue --force-share --output=json" returned: 0 in 0.145s {{(pid=71283) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:36:18 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:36:18 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:36:18 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:36:18 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:36:18 user 
nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:36:18 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:36:18 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=8745MB free_disk=26.44277572631836GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", 
"product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": 
"07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": 
"type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 10:36:18 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:36:18 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:36:19 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 64fb0d55-ef35-4386-86fe-00775b83a8d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:36:19 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance e1115ad2-b858-4859-b1bb-2175f7eab867 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:36:19 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 50cff6dc-1947-417a-8c5f-0b10de5dfd3a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:36:19 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 5d73c323-31d8-4695-b8e2-5826c73ebb6b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:36:19 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 50181cb3-e752-4b72-a09d-e7fdae1edd8f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:36:19 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 5 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:36:19 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=1152MB phys_disk=40GB used_disk=5GB total_vcpus=12 used_vcpus=5 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:36:19 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:36:19 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:36:19 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:36:19 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.281s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:36:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:36:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:36:20 user nova-compute[71283]: DEBUG nova.compute.manager [req-8a51c1eb-4f6d-4f53-8f07-815474832bd0 req-c9206468-6c36-4f65-912e-a6527c98ab46 service nova] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Received event network-changed-a9863408-9295-4836-b0a6-b785f76c092a {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:36:20 user nova-compute[71283]: DEBUG nova.compute.manager [req-8a51c1eb-4f6d-4f53-8f07-815474832bd0 req-c9206468-6c36-4f65-912e-a6527c98ab46 service nova] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Refreshing instance network info cache due to event network-changed-a9863408-9295-4836-b0a6-b785f76c092a. 
{{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:36:20 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-8a51c1eb-4f6d-4f53-8f07-815474832bd0 req-c9206468-6c36-4f65-912e-a6527c98ab46 service nova] Acquiring lock "refresh_cache-5d73c323-31d8-4695-b8e2-5826c73ebb6b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:36:20 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-8a51c1eb-4f6d-4f53-8f07-815474832bd0 req-c9206468-6c36-4f65-912e-a6527c98ab46 service nova] Acquired lock "refresh_cache-5d73c323-31d8-4695-b8e2-5826c73ebb6b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:36:20 user nova-compute[71283]: DEBUG nova.network.neutron [req-8a51c1eb-4f6d-4f53-8f07-815474832bd0 req-c9206468-6c36-4f65-912e-a6527c98ab46 service nova] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Refreshing network info cache for port a9863408-9295-4836-b0a6-b785f76c092a {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:36:20 user nova-compute[71283]: DEBUG nova.network.neutron [req-8a51c1eb-4f6d-4f53-8f07-815474832bd0 req-c9206468-6c36-4f65-912e-a6527c98ab46 service nova] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Updated VIF entry in instance network info cache for port a9863408-9295-4836-b0a6-b785f76c092a. 
{{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:36:20 user nova-compute[71283]: DEBUG nova.network.neutron [req-8a51c1eb-4f6d-4f53-8f07-815474832bd0 req-c9206468-6c36-4f65-912e-a6527c98ab46 service nova] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Updating instance_info_cache with network_info: [{"id": "a9863408-9295-4836-b0a6-b785f76c092a", "address": "fa:16:3e:3f:4b:ee", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.114", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9863408-92", "ovs_interfaceid": "a9863408-9295-4836-b0a6-b785f76c092a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:36:20 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-8a51c1eb-4f6d-4f53-8f07-815474832bd0 req-c9206468-6c36-4f65-912e-a6527c98ab46 service nova] Releasing lock "refresh_cache-5d73c323-31d8-4695-b8e2-5826c73ebb6b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:36:21 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running 
periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:36:21 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:36:21 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:36:21 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:36:21 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:36:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "refresh_cache-50cff6dc-1947-417a-8c5f-0b10de5dfd3a" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:36:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquired lock "refresh_cache-50cff6dc-1947-417a-8c5f-0b10de5dfd3a" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:36:21 user nova-compute[71283]: DEBUG nova.network.neutron [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Forcefully refreshing network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 10:36:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-6efdec39-8296-466f-98eb-3273b652a0e5 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquiring lock "5d73c323-31d8-4695-b8e2-5826c73ebb6b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:36:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-6efdec39-8296-466f-98eb-3273b652a0e5 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "5d73c323-31d8-4695-b8e2-5826c73ebb6b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:36:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-6efdec39-8296-466f-98eb-3273b652a0e5 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquiring lock "5d73c323-31d8-4695-b8e2-5826c73ebb6b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:36:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-6efdec39-8296-466f-98eb-3273b652a0e5 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "5d73c323-31d8-4695-b8e2-5826c73ebb6b-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:36:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-6efdec39-8296-466f-98eb-3273b652a0e5 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "5d73c323-31d8-4695-b8e2-5826c73ebb6b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:36:21 user nova-compute[71283]: INFO nova.compute.manager [None req-6efdec39-8296-466f-98eb-3273b652a0e5 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Terminating instance Apr 20 10:36:21 user nova-compute[71283]: DEBUG nova.compute.manager [None req-6efdec39-8296-466f-98eb-3273b652a0e5 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Start destroying the instance on the hypervisor. 
{{(pid=71283) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 10:36:21 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:36:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:36:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:36:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:36:22 user nova-compute[71283]: DEBUG nova.compute.manager [req-9dc2b74a-e3dc-43c4-8032-6f768d1ab3ce req-262da206-e57f-489d-8e88-5b5cd4a430f1 service nova] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Received event network-vif-unplugged-a9863408-9295-4836-b0a6-b785f76c092a {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:36:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-9dc2b74a-e3dc-43c4-8032-6f768d1ab3ce req-262da206-e57f-489d-8e88-5b5cd4a430f1 service nova] Acquiring lock "5d73c323-31d8-4695-b8e2-5826c73ebb6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:36:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-9dc2b74a-e3dc-43c4-8032-6f768d1ab3ce req-262da206-e57f-489d-8e88-5b5cd4a430f1 service nova] Lock "5d73c323-31d8-4695-b8e2-5826c73ebb6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:36:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-9dc2b74a-e3dc-43c4-8032-6f768d1ab3ce req-262da206-e57f-489d-8e88-5b5cd4a430f1 service nova] Lock "5d73c323-31d8-4695-b8e2-5826c73ebb6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:36:22 user nova-compute[71283]: DEBUG nova.compute.manager [req-9dc2b74a-e3dc-43c4-8032-6f768d1ab3ce req-262da206-e57f-489d-8e88-5b5cd4a430f1 service nova] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] No waiting events found dispatching network-vif-unplugged-a9863408-9295-4836-b0a6-b785f76c092a {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:36:22 user nova-compute[71283]: DEBUG nova.compute.manager [req-9dc2b74a-e3dc-43c4-8032-6f768d1ab3ce req-262da206-e57f-489d-8e88-5b5cd4a430f1 service nova] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Received event network-vif-unplugged-a9863408-9295-4836-b0a6-b785f76c092a for instance with task_state deleting. 
{{(pid=71283) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 10:36:22 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Updating instance_info_cache with network_info: [{"id": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "address": "fa:16:3e:c7:53:84", "network": {"id": "ae2c97fe-5f89-4830-9303-439b5956dc16", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1597087439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d6d8880a9263444cba94725a83974403", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap56f36d99-98", "ovs_interfaceid": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:36:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Releasing lock "refresh_cache-50cff6dc-1947-417a-8c5f-0b10de5dfd3a" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:36:22 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Updated the network info_cache for instance {{(pid=71283) _heal_instance_info_cache 
/opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 10:36:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:36:22 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Instance destroyed successfully. Apr 20 10:36:22 user nova-compute[71283]: DEBUG nova.objects.instance [None req-6efdec39-8296-466f-98eb-3273b652a0e5 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lazy-loading 'resources' on Instance uuid 5d73c323-31d8-4695-b8e2-5826c73ebb6b {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:36:22 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-6efdec39-8296-466f-98eb-3273b652a0e5 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:34:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1225264902',display_name='tempest-AttachVolumeNegativeTest-server-1225264902',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1225264902',id=13,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHlXb2fBMUpwjWcMzz3zehln+ROLLyuvx1FKwns94rOIHkQii3UgvuVBxZIQ5wQRDDXLv5B8P3RJ3QpXQC3Ze1OezsE+YLGNFce+g2+NCB+C/troUPUvbU32x8whbZGnag==',key_name='tempest-keypair-1070174971',keypairs=<?>,launch_index=0,launched_at=2023-04-20T10:34:35Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='12e621999b00481c839affc4e83ce37c',ramdisk_id='',reservation_id='r-175tdypi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-1619866573',owner_user_name='tempest-AttachVolumeNegativeTest-1619866573-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2023-04-20T10:34:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a04be58cd354d508616edd9d5eeff54',uuid=5d73c323-31d8-4695-b8e2-5826c73ebb6b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a9863408-9295-4836-b0a6-b785f76c092a", "address": "fa:16:3e:3f:4b:ee", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": 
"tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.114", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9863408-92", "ovs_interfaceid": "a9863408-9295-4836-b0a6-b785f76c092a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 10:36:22 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-6efdec39-8296-466f-98eb-3273b652a0e5 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Converting VIF {"id": "a9863408-9295-4836-b0a6-b785f76c092a", "address": "fa:16:3e:3f:4b:ee", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.114", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": 
"l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9863408-92", "ovs_interfaceid": "a9863408-9295-4836-b0a6-b785f76c092a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:36:22 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-6efdec39-8296-466f-98eb-3273b652a0e5 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3f:4b:ee,bridge_name='br-int',has_traffic_filtering=True,id=a9863408-9295-4836-b0a6-b785f76c092a,network=Network(51b24d7b-a37b-453a-a335-3ba809f26953),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9863408-92') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:36:22 user nova-compute[71283]: DEBUG os_vif [None req-6efdec39-8296-466f-98eb-3273b652a0e5 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3f:4b:ee,bridge_name='br-int',has_traffic_filtering=True,id=a9863408-9295-4836-b0a6-b785f76c092a,network=Network(51b24d7b-a37b-453a-a335-3ba809f26953),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9863408-92') {{(pid=71283) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 10:36:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:36:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa9863408-92, bridge=br-int, if_exists=True) {{(pid=71283) 
do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:36:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:36:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:36:22 user nova-compute[71283]: INFO os_vif [None req-6efdec39-8296-466f-98eb-3273b652a0e5 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3f:4b:ee,bridge_name='br-int',has_traffic_filtering=True,id=a9863408-9295-4836-b0a6-b785f76c092a,network=Network(51b24d7b-a37b-453a-a335-3ba809f26953),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9863408-92') Apr 20 10:36:22 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-6efdec39-8296-466f-98eb-3273b652a0e5 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Deleting instance files /opt/stack/data/nova/instances/5d73c323-31d8-4695-b8e2-5826c73ebb6b_del Apr 20 10:36:22 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-6efdec39-8296-466f-98eb-3273b652a0e5 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Deletion of /opt/stack/data/nova/instances/5d73c323-31d8-4695-b8e2-5826c73ebb6b_del complete Apr 20 10:36:22 user nova-compute[71283]: INFO nova.compute.manager [None req-6efdec39-8296-466f-98eb-3273b652a0e5 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] 
Took 0.66 seconds to destroy the instance on the hypervisor. Apr 20 10:36:22 user nova-compute[71283]: DEBUG oslo.service.loopingcall [None req-6efdec39-8296-466f-98eb-3273b652a0e5 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=71283) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 10:36:22 user nova-compute[71283]: DEBUG nova.compute.manager [-] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Deallocating network for instance {{(pid=71283) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 10:36:22 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] deallocate_for_instance() {{(pid=71283) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 10:36:22 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:36:22 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:36:22 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:36:23 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:36:23 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Took 0.79 seconds to deallocate network for instance. Apr 20 10:36:23 user nova-compute[71283]: DEBUG nova.compute.manager [req-9b6d6ffb-9550-4dcb-9c89-2f219e66773e req-078ab178-6582-47a8-99b9-fc15b3c1ee1e service nova] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Received event network-vif-deleted-a9863408-9295-4836-b0a6-b785f76c092a {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:36:23 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-6efdec39-8296-466f-98eb-3273b652a0e5 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:36:23 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-6efdec39-8296-466f-98eb-3273b652a0e5 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:36:23 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-6efdec39-8296-466f-98eb-3273b652a0e5 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Inventory has 
not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:36:23 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-6efdec39-8296-466f-98eb-3273b652a0e5 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:36:23 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-6efdec39-8296-466f-98eb-3273b652a0e5 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.239s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:36:23 user nova-compute[71283]: INFO nova.scheduler.client.report [None req-6efdec39-8296-466f-98eb-3273b652a0e5 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Deleted allocations for instance 5d73c323-31d8-4695-b8e2-5826c73ebb6b Apr 20 10:36:23 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-6efdec39-8296-466f-98eb-3273b652a0e5 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "5d73c323-31d8-4695-b8e2-5826c73ebb6b" "released" by 
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.869s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:36:24 user nova-compute[71283]: DEBUG nova.compute.manager [req-8d26257c-b507-4c48-851f-243f8b2e01dd req-2d1d61b0-e6b3-43b4-90d6-a7c741b6dce0 service nova] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Received event network-vif-plugged-a9863408-9295-4836-b0a6-b785f76c092a {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:36:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-8d26257c-b507-4c48-851f-243f8b2e01dd req-2d1d61b0-e6b3-43b4-90d6-a7c741b6dce0 service nova] Acquiring lock "5d73c323-31d8-4695-b8e2-5826c73ebb6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:36:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-8d26257c-b507-4c48-851f-243f8b2e01dd req-2d1d61b0-e6b3-43b4-90d6-a7c741b6dce0 service nova] Lock "5d73c323-31d8-4695-b8e2-5826c73ebb6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:36:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-8d26257c-b507-4c48-851f-243f8b2e01dd req-2d1d61b0-e6b3-43b4-90d6-a7c741b6dce0 service nova] Lock "5d73c323-31d8-4695-b8e2-5826c73ebb6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:36:24 user nova-compute[71283]: DEBUG nova.compute.manager [req-8d26257c-b507-4c48-851f-243f8b2e01dd req-2d1d61b0-e6b3-43b4-90d6-a7c741b6dce0 service nova] [instance: 
5d73c323-31d8-4695-b8e2-5826c73ebb6b] No waiting events found dispatching network-vif-plugged-a9863408-9295-4836-b0a6-b785f76c092a {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:36:24 user nova-compute[71283]: WARNING nova.compute.manager [req-8d26257c-b507-4c48-851f-243f8b2e01dd req-2d1d61b0-e6b3-43b4-90d6-a7c741b6dce0 service nova] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Received unexpected event network-vif-plugged-a9863408-9295-4836-b0a6-b785f76c092a for instance with vm_state deleted and task_state None. Apr 20 10:36:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:36:27 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:36:28 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:36:32 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:36:37 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-6cede009-e454-4acd-9ded-d6bb20b23f36 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Acquiring lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:36:37 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None 
req-6cede009-e454-4acd-9ded-d6bb20b23f36 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:36:37 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-6cede009-e454-4acd-9ded-d6bb20b23f36 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Acquiring lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:36:37 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-6cede009-e454-4acd-9ded-d6bb20b23f36 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:36:37 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-6cede009-e454-4acd-9ded-d6bb20b23f36 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:36:37 user nova-compute[71283]: INFO nova.compute.manager [None req-6cede009-e454-4acd-9ded-d6bb20b23f36 
tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Terminating instance Apr 20 10:36:37 user nova-compute[71283]: DEBUG nova.compute.manager [None req-6cede009-e454-4acd-9ded-d6bb20b23f36 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Start destroying the instance on the hypervisor. {{(pid=71283) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 10:36:37 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:36:37 user nova-compute[71283]: DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: ..., 5d73c323-31d8-4695-b8e2-5826c73ebb6b => Stopped> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:36:37 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] VM Stopped (Lifecycle Event) Apr 20 10:36:37 user nova-compute[71283]: DEBUG nova.compute.manager [None req-e8e59453-47fa-4ae6-8eba-3b2439f51e70 None None] [instance: 5d73c323-31d8-4695-b8e2-5826c73ebb6b] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:36:37 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:36:37 user nova-compute[71283]: DEBUG nova.compute.manager [req-f30e2a1b-70f0-4060-8006-0f220081e0dc req-e55f88f2-6589-424f-aa0b-2d54ac6da6ac service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Received event network-vif-unplugged-56f36d99-9847-4d4d-bf9b-c1c2244bac79 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:36:37 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils 
[req-f30e2a1b-70f0-4060-8006-0f220081e0dc req-e55f88f2-6589-424f-aa0b-2d54ac6da6ac service nova] Acquiring lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:36:37 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-f30e2a1b-70f0-4060-8006-0f220081e0dc req-e55f88f2-6589-424f-aa0b-2d54ac6da6ac service nova] Lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:36:37 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-f30e2a1b-70f0-4060-8006-0f220081e0dc req-e55f88f2-6589-424f-aa0b-2d54ac6da6ac service nova] Lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:36:37 user nova-compute[71283]: DEBUG nova.compute.manager [req-f30e2a1b-70f0-4060-8006-0f220081e0dc req-e55f88f2-6589-424f-aa0b-2d54ac6da6ac service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] No waiting events found dispatching network-vif-unplugged-56f36d99-9847-4d4d-bf9b-c1c2244bac79 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:36:37 user nova-compute[71283]: DEBUG nova.compute.manager [req-f30e2a1b-70f0-4060-8006-0f220081e0dc req-e55f88f2-6589-424f-aa0b-2d54ac6da6ac service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Received event network-vif-unplugged-56f36d99-9847-4d4d-bf9b-c1c2244bac79 for instance with task_state deleting. 
{{(pid=71283) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 10:36:37 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:36:37 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:36:37 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Instance destroyed successfully. Apr 20 10:36:37 user nova-compute[71283]: DEBUG nova.objects.instance [None req-6cede009-e454-4acd-9ded-d6bb20b23f36 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lazy-loading 'resources' on Instance uuid 50cff6dc-1947-417a-8c5f-0b10de5dfd3a {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:36:37 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-6cede009-e454-4acd-9ded-d6bb20b23f36 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:31:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-325022623',display_name='tempest-ServerRescueNegativeTestJSON-server-325022623',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-325022623',id=12,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-20T10:33:48Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='d6d8880a9263444cba94725a83974403',ramdisk_id='',reservation_id='r-5my5271p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerRescueNegativeTestJSON-678567479',owner_user_name='tempest-ServerRescueNegativeTestJSON-678567479-project-member'},tags=,task_state='d
eleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T10:33:48Z,user_data=None,user_id='729f2fdafa8e471e8f0de0c8323c36b5',uuid=50cff6dc-1947-417a-8c5f-0b10de5dfd3a,vcpu_model=,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "address": "fa:16:3e:c7:53:84", "network": {"id": "ae2c97fe-5f89-4830-9303-439b5956dc16", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1597087439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d6d8880a9263444cba94725a83974403", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap56f36d99-98", "ovs_interfaceid": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 10:36:37 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-6cede009-e454-4acd-9ded-d6bb20b23f36 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Converting VIF {"id": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "address": "fa:16:3e:c7:53:84", "network": {"id": "ae2c97fe-5f89-4830-9303-439b5956dc16", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1597087439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d6d8880a9263444cba94725a83974403", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap56f36d99-98", "ovs_interfaceid": "56f36d99-9847-4d4d-bf9b-c1c2244bac79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:36:37 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-6cede009-e454-4acd-9ded-d6bb20b23f36 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c7:53:84,bridge_name='br-int',has_traffic_filtering=True,id=56f36d99-9847-4d4d-bf9b-c1c2244bac79,network=Network(ae2c97fe-5f89-4830-9303-439b5956dc16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56f36d99-98') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:36:37 user nova-compute[71283]: DEBUG os_vif [None req-6cede009-e454-4acd-9ded-d6bb20b23f36 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c7:53:84,bridge_name='br-int',has_traffic_filtering=True,id=56f36d99-9847-4d4d-bf9b-c1c2244bac79,network=Network(ae2c97fe-5f89-4830-9303-439b5956dc16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56f36d99-98') {{(pid=71283) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 10:36:37 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:36:37 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap56f36d99-98, bridge=br-int, if_exists=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:36:37 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:36:37 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:36:37 user nova-compute[71283]: INFO os_vif [None req-6cede009-e454-4acd-9ded-d6bb20b23f36 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c7:53:84,bridge_name='br-int',has_traffic_filtering=True,id=56f36d99-9847-4d4d-bf9b-c1c2244bac79,network=Network(ae2c97fe-5f89-4830-9303-439b5956dc16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56f36d99-98') Apr 20 10:36:37 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-6cede009-e454-4acd-9ded-d6bb20b23f36 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Deleting instance files /opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a_del Apr 20 10:36:37 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-6cede009-e454-4acd-9ded-d6bb20b23f36 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Deletion of 
/opt/stack/data/nova/instances/50cff6dc-1947-417a-8c5f-0b10de5dfd3a_del complete Apr 20 10:36:38 user nova-compute[71283]: INFO nova.compute.manager [None req-6cede009-e454-4acd-9ded-d6bb20b23f36 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Took 0.67 seconds to destroy the instance on the hypervisor. Apr 20 10:36:38 user nova-compute[71283]: DEBUG oslo.service.loopingcall [None req-6cede009-e454-4acd-9ded-d6bb20b23f36 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71283) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 10:36:38 user nova-compute[71283]: DEBUG nova.compute.manager [-] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Deallocating network for instance {{(pid=71283) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 10:36:38 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] deallocate_for_instance() {{(pid=71283) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 10:36:38 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:36:38 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Took 0.81 seconds to deallocate network for instance. 
Apr 20 10:36:38 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-6cede009-e454-4acd-9ded-d6bb20b23f36 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:36:38 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-6cede009-e454-4acd-9ded-d6bb20b23f36 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:36:39 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-6cede009-e454-4acd-9ded-d6bb20b23f36 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:36:39 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-6cede009-e454-4acd-9ded-d6bb20b23f36 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) 
set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:36:39 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-6cede009-e454-4acd-9ded-d6bb20b23f36 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.189s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:36:39 user nova-compute[71283]: INFO nova.scheduler.client.report [None req-6cede009-e454-4acd-9ded-d6bb20b23f36 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Deleted allocations for instance 50cff6dc-1947-417a-8c5f-0b10de5dfd3a Apr 20 10:36:39 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-6cede009-e454-4acd-9ded-d6bb20b23f36 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.846s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:36:39 user nova-compute[71283]: DEBUG nova.compute.manager [req-e6692020-6c2c-47f9-9dc6-f7caed1ae2fb req-54474ef2-ec3a-49a0-b9e6-95f0d673c1c4 service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Received event network-vif-plugged-56f36d99-9847-4d4d-bf9b-c1c2244bac79 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:36:39 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-e6692020-6c2c-47f9-9dc6-f7caed1ae2fb req-54474ef2-ec3a-49a0-b9e6-95f0d673c1c4 service nova] Acquiring lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:36:39 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-e6692020-6c2c-47f9-9dc6-f7caed1ae2fb req-54474ef2-ec3a-49a0-b9e6-95f0d673c1c4 service nova] Lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:36:39 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-e6692020-6c2c-47f9-9dc6-f7caed1ae2fb req-54474ef2-ec3a-49a0-b9e6-95f0d673c1c4 service nova] Lock "50cff6dc-1947-417a-8c5f-0b10de5dfd3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:36:39 user nova-compute[71283]: DEBUG nova.compute.manager [req-e6692020-6c2c-47f9-9dc6-f7caed1ae2fb req-54474ef2-ec3a-49a0-b9e6-95f0d673c1c4 service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] No waiting events found dispatching network-vif-plugged-56f36d99-9847-4d4d-bf9b-c1c2244bac79 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:36:39 user nova-compute[71283]: WARNING nova.compute.manager [req-e6692020-6c2c-47f9-9dc6-f7caed1ae2fb req-54474ef2-ec3a-49a0-b9e6-95f0d673c1c4 service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Received unexpected event network-vif-plugged-56f36d99-9847-4d4d-bf9b-c1c2244bac79 for instance with vm_state deleted and task_state None. 
Apr 20 10:36:39 user nova-compute[71283]: DEBUG nova.compute.manager [req-e6692020-6c2c-47f9-9dc6-f7caed1ae2fb req-54474ef2-ec3a-49a0-b9e6-95f0d673c1c4 service nova] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Received event network-vif-deleted-56f36d99-9847-4d4d-bf9b-c1c2244bac79 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:36:42 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:36:47 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:36:49 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f7347785-a9f8-43cb-9e2f-e4102f828b07 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Acquiring lock "50181cb3-e752-4b72-a09d-e7fdae1edd8f" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:36:49 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f7347785-a9f8-43cb-9e2f-e4102f828b07 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "50181cb3-e752-4b72-a09d-e7fdae1edd8f" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:36:49 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f7347785-a9f8-43cb-9e2f-e4102f828b07 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Acquiring lock "50181cb3-e752-4b72-a09d-e7fdae1edd8f-events" by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:36:49 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f7347785-a9f8-43cb-9e2f-e4102f828b07 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "50181cb3-e752-4b72-a09d-e7fdae1edd8f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:36:49 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f7347785-a9f8-43cb-9e2f-e4102f828b07 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "50181cb3-e752-4b72-a09d-e7fdae1edd8f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:36:49 user nova-compute[71283]: INFO nova.compute.manager [None req-f7347785-a9f8-43cb-9e2f-e4102f828b07 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Terminating instance Apr 20 10:36:49 user nova-compute[71283]: DEBUG nova.compute.manager [None req-f7347785-a9f8-43cb-9e2f-e4102f828b07 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Start destroying the instance on the hypervisor. 
{{(pid=71283) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 10:36:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:36:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:36:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:36:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:36:49 user nova-compute[71283]: DEBUG nova.compute.manager [req-d9453410-2b99-42c7-9929-55eed655c493 req-f905ad73-ce72-454c-9e5a-ea0d21ccdd95 service nova] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Received event network-vif-unplugged-fb9928c1-f326-4b03-847c-3e450689dedb {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:36:49 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-d9453410-2b99-42c7-9929-55eed655c493 req-f905ad73-ce72-454c-9e5a-ea0d21ccdd95 service nova] Acquiring lock "50181cb3-e752-4b72-a09d-e7fdae1edd8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:36:49 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-d9453410-2b99-42c7-9929-55eed655c493 req-f905ad73-ce72-454c-9e5a-ea0d21ccdd95 service nova] Lock "50181cb3-e752-4b72-a09d-e7fdae1edd8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:36:49 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-d9453410-2b99-42c7-9929-55eed655c493 req-f905ad73-ce72-454c-9e5a-ea0d21ccdd95 service nova] Lock "50181cb3-e752-4b72-a09d-e7fdae1edd8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:36:49 user nova-compute[71283]: DEBUG nova.compute.manager [req-d9453410-2b99-42c7-9929-55eed655c493 req-f905ad73-ce72-454c-9e5a-ea0d21ccdd95 service nova] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] No waiting events found dispatching network-vif-unplugged-fb9928c1-f326-4b03-847c-3e450689dedb {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:36:49 user nova-compute[71283]: DEBUG nova.compute.manager [req-d9453410-2b99-42c7-9929-55eed655c493 req-f905ad73-ce72-454c-9e5a-ea0d21ccdd95 service nova] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Received event network-vif-unplugged-fb9928c1-f326-4b03-847c-3e450689dedb for instance with task_state deleting. {{(pid=71283) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 10:36:50 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Instance destroyed successfully. 
Apr 20 10:36:50 user nova-compute[71283]: DEBUG nova.objects.instance [None req-f7347785-a9f8-43cb-9e2f-e4102f828b07 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lazy-loading 'resources' on Instance uuid 50181cb3-e752-4b72-a09d-e7fdae1edd8f {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:36:50 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-f7347785-a9f8-43cb-9e2f-e4102f828b07 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:34:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1531913966',display_name='tempest-TestMinimumBasicScenario-server-1531913966',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1531913966',id=14,image_ref='121cb977-6d50-466e-b474-423e6e1e40c7',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPha8UTs+y6zQwYiHYJU8S3tkn97JtlaaAeIMql6mOiK4l9+Zu9OIlPQG4QaIz4bSwaRtyjprIcWT35a8BtT0jDPIeyBy776x5V6SLLxayNSY8lG5hkGVg+EEmbLws1bHQ==',key_name='tempest-TestMinimumBasicScenario-693580580',keypairs=,launch_index=0,launched_at=2023-04-20T10:35:02Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='3e5913ed7eb944ddae4b38e1d746e7b9',ramdisk_id='',reservation_id='r-380fuc8d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='121cb977-6d50-466e-b474-423e6e1e40c7',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-244367270',owner_user_name='tempest-TestMinimumBasicScenario-244367270-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T10:35:03Z,user_data=None,user_id='878178b690704d048c37c79d596e953b',uuid=50181cb3-e752-4b72-a09d-e7fdae1edd8f,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fb9928c1-f326-4b03-847c-3e450689dedb", "address": "fa:16:3e:8a:69:04", "network": {"id": "0ebbf38f-49de-48cb-ab19-2af234cb93fc", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-649679020-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5913ed7eb944ddae4b38e1d746e7b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb9928c1-f3", "ovs_interfaceid": "fb9928c1-f326-4b03-847c-3e450689dedb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 10:36:50 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-f7347785-a9f8-43cb-9e2f-e4102f828b07 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Converting VIF {"id": "fb9928c1-f326-4b03-847c-3e450689dedb", "address": "fa:16:3e:8a:69:04", "network": {"id": "0ebbf38f-49de-48cb-ab19-2af234cb93fc", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-649679020-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5913ed7eb944ddae4b38e1d746e7b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb9928c1-f3", "ovs_interfaceid": "fb9928c1-f326-4b03-847c-3e450689dedb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:36:50 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None 
req-f7347785-a9f8-43cb-9e2f-e4102f828b07 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:69:04,bridge_name='br-int',has_traffic_filtering=True,id=fb9928c1-f326-4b03-847c-3e450689dedb,network=Network(0ebbf38f-49de-48cb-ab19-2af234cb93fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb9928c1-f3') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:36:50 user nova-compute[71283]: DEBUG os_vif [None req-f7347785-a9f8-43cb-9e2f-e4102f828b07 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:69:04,bridge_name='br-int',has_traffic_filtering=True,id=fb9928c1-f326-4b03-847c-3e450689dedb,network=Network(0ebbf38f-49de-48cb-ab19-2af234cb93fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb9928c1-f3') {{(pid=71283) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 10:36:50 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:36:50 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfb9928c1-f3, bridge=br-int, if_exists=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:36:50 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:36:50 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:36:50 user nova-compute[71283]: INFO os_vif [None req-f7347785-a9f8-43cb-9e2f-e4102f828b07 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:69:04,bridge_name='br-int',has_traffic_filtering=True,id=fb9928c1-f326-4b03-847c-3e450689dedb,network=Network(0ebbf38f-49de-48cb-ab19-2af234cb93fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb9928c1-f3') Apr 20 10:36:50 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-f7347785-a9f8-43cb-9e2f-e4102f828b07 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Deleting instance files /opt/stack/data/nova/instances/50181cb3-e752-4b72-a09d-e7fdae1edd8f_del Apr 20 10:36:50 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-f7347785-a9f8-43cb-9e2f-e4102f828b07 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Deletion of /opt/stack/data/nova/instances/50181cb3-e752-4b72-a09d-e7fdae1edd8f_del complete Apr 20 10:36:50 user nova-compute[71283]: INFO nova.compute.manager [None req-f7347785-a9f8-43cb-9e2f-e4102f828b07 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Took 0.86 seconds to destroy the instance on the hypervisor. Apr 20 10:36:50 user nova-compute[71283]: DEBUG oslo.service.loopingcall [None req-f7347785-a9f8-43cb-9e2f-e4102f828b07 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=71283) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 10:36:50 user nova-compute[71283]: DEBUG nova.compute.manager [-] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Deallocating network for instance {{(pid=71283) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 10:36:50 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] deallocate_for_instance() {{(pid=71283) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 10:36:50 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:36:50 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Took 0.48 seconds to deallocate network for instance. 
Apr 20 10:36:50 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f7347785-a9f8-43cb-9e2f-e4102f828b07 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:36:50 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f7347785-a9f8-43cb-9e2f-e4102f828b07 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:36:50 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-f7347785-a9f8-43cb-9e2f-e4102f828b07 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:36:50 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-f7347785-a9f8-43cb-9e2f-e4102f828b07 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:36:50 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f7347785-a9f8-43cb-9e2f-e4102f828b07 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.178s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:36:50 user nova-compute[71283]: INFO nova.scheduler.client.report [None req-f7347785-a9f8-43cb-9e2f-e4102f828b07 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Deleted allocations for instance 50181cb3-e752-4b72-a09d-e7fdae1edd8f Apr 20 10:36:51 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f7347785-a9f8-43cb-9e2f-e4102f828b07 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "50181cb3-e752-4b72-a09d-e7fdae1edd8f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.704s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:36:51 user nova-compute[71283]: DEBUG nova.compute.manager [req-282ae317-c1ba-4f1c-9d32-400408de849b req-3a91f7be-d951-40ba-9647-ecb7d8ae1351 service nova] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Received event network-vif-plugged-fb9928c1-f326-4b03-847c-3e450689dedb {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:36:51 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-282ae317-c1ba-4f1c-9d32-400408de849b req-3a91f7be-d951-40ba-9647-ecb7d8ae1351 service nova] Acquiring lock "50181cb3-e752-4b72-a09d-e7fdae1edd8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:36:51 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-282ae317-c1ba-4f1c-9d32-400408de849b req-3a91f7be-d951-40ba-9647-ecb7d8ae1351 service nova] Lock "50181cb3-e752-4b72-a09d-e7fdae1edd8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:36:51 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-282ae317-c1ba-4f1c-9d32-400408de849b req-3a91f7be-d951-40ba-9647-ecb7d8ae1351 service nova] Lock "50181cb3-e752-4b72-a09d-e7fdae1edd8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:36:51 user nova-compute[71283]: DEBUG nova.compute.manager [req-282ae317-c1ba-4f1c-9d32-400408de849b req-3a91f7be-d951-40ba-9647-ecb7d8ae1351 service nova] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] No waiting events found dispatching network-vif-plugged-fb9928c1-f326-4b03-847c-3e450689dedb {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:36:51 user nova-compute[71283]: WARNING nova.compute.manager [req-282ae317-c1ba-4f1c-9d32-400408de849b req-3a91f7be-d951-40ba-9647-ecb7d8ae1351 service nova] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Received unexpected event network-vif-plugged-fb9928c1-f326-4b03-847c-3e450689dedb for instance with vm_state deleted and task_state None. 
Apr 20 10:36:51 user nova-compute[71283]: DEBUG nova.compute.manager [req-282ae317-c1ba-4f1c-9d32-400408de849b req-3a91f7be-d951-40ba-9647-ecb7d8ae1351 service nova] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Received event network-vif-deleted-fb9928c1-f326-4b03-847c-3e450689dedb {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:36:52 user nova-compute[71283]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:36:52 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] VM Stopped (Lifecycle Event) Apr 20 10:36:52 user nova-compute[71283]: DEBUG nova.compute.manager [None req-74bae648-315e-4887-89ac-30b070a27b61 None None] [instance: 50cff6dc-1947-417a-8c5f-0b10de5dfd3a] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:36:55 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:00 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:37:00 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:00 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:37:00 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:37:00 user nova-compute[71283]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:37:00 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:05 user nova-compute[71283]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:37:05 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] VM Stopped (Lifecycle Event) Apr 20 10:37:05 user nova-compute[71283]: DEBUG nova.compute.manager [None req-c28b47c7-136a-4031-9b24-1d4b99a0c18f None None] [instance: 50181cb3-e752-4b72-a09d-e7fdae1edd8f] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:37:05 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:10 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:37:13 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:15 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquiring lock "baecb573-a3f1-42db-bc1b-788220753a7d" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:37:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "baecb573-a3f1-42db-bc1b-788220753a7d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:37:15 user nova-compute[71283]: DEBUG nova.compute.manager [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Starting instance... {{(pid=71283) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 10:37:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:37:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:37:15 user nova-compute[71283]: DEBUG nova.virt.hardware [None 
req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71283) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 10:37:15 user nova-compute[71283]: INFO nova.compute.claims [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Claim successful on node user Apr 20 10:37:15 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:37:15 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:37:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] 
Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.249s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:37:15 user nova-compute[71283]: DEBUG nova.compute.manager [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Start building networks asynchronously for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG nova.compute.manager [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Allocating IP information in the background. {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG nova.network.neutron [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] allocate_for_instance() {{(pid=71283) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 10:37:16 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 20 10:37:16 user nova-compute[71283]: DEBUG nova.compute.manager [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Start building block device mappings for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG nova.policy [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2a04be58cd354d508616edd9d5eeff54', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '12e621999b00481c839affc4e83ce37c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71283) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG nova.compute.manager [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Start spawning the instance on the hypervisor. 
{{(pid=71283) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Creating instance directory {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 10:37:16 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Creating image(s) Apr 20 10:37:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquiring lock "/opt/stack/data/nova/instances/baecb573-a3f1-42db-bc1b-788220753a7d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "/opt/stack/data/nova/instances/baecb573-a3f1-42db-bc1b-788220753a7d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 
tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "/opt/stack/data/nova/instances/baecb573-a3f1-42db-bc1b-788220753a7d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.127s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquiring lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:37:16 user nova-compute[71283]: 
DEBUG oslo_concurrency.lockutils [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.136s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw 
/opt/stack/data/nova/instances/baecb573-a3f1-42db-bc1b-788220753a7d/disk 1073741824 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/baecb573-a3f1-42db-bc1b-788220753a7d/disk 1073741824" returned: 0 in 0.053s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.196s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 
tempest-AttachVolumeNegativeTest-1619866573-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.127s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Checking if we can resize image /opt/stack/data/nova/instances/baecb573-a3f1-42db-bc1b-788220753a7d/disk. size=1073741824 {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/baecb573-a3f1-42db-bc1b-788220753a7d/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} 
Apr 20 10:37:16 user nova-compute[71283]: DEBUG nova.network.neutron [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Successfully created port: 79499d87-7e81-458e-ac87-ddde51eaefc6 {{(pid=71283) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/baecb573-a3f1-42db-bc1b-788220753a7d/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Cannot resize image /opt/stack/data/nova/instances/baecb573-a3f1-42db-bc1b-788220753a7d/disk to a smaller size. 
{{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG nova.objects.instance [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lazy-loading 'migration_context' on Instance uuid baecb573-a3f1-42db-bc1b-788220753a7d {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Created local disks {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Ensure instance console log exists: /opt/stack/data/nova/instances/baecb573-a3f1-42db-bc1b-788220753a7d/console.log {{(pid=71283) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:37:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.network.neutron [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Successfully updated port: 79499d87-7e81-458e-ac87-ddde51eaefc6 {{(pid=71283) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquiring lock "refresh_cache-baecb573-a3f1-42db-bc1b-788220753a7d" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquired lock "refresh_cache-baecb573-a3f1-42db-bc1b-788220753a7d" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.network.neutron [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 
tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Building network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.compute.manager [req-564d1526-1eba-4f62-90b7-f37efec8cfac req-eb7580c8-c1a7-46cf-bd1b-87bbb6767c8f service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Received event network-changed-79499d87-7e81-458e-ac87-ddde51eaefc6 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.compute.manager [req-564d1526-1eba-4f62-90b7-f37efec8cfac req-eb7580c8-c1a7-46cf-bd1b-87bbb6767c8f service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Refreshing instance network info cache due to event network-changed-79499d87-7e81-458e-ac87-ddde51eaefc6. {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-564d1526-1eba-4f62-90b7-f37efec8cfac req-eb7580c8-c1a7-46cf-bd1b-87bbb6767c8f service nova] Acquiring lock "refresh_cache-baecb573-a3f1-42db-bc1b-788220753a7d" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.network.neutron [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Instance cache missing network info. 
{{(pid=71283) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.network.neutron [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Updating instance_info_cache with network_info: [{"id": "79499d87-7e81-458e-ac87-ddde51eaefc6", "address": "fa:16:3e:21:9b:f7", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap79499d87-7e", "ovs_interfaceid": "79499d87-7e81-458e-ac87-ddde51eaefc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Releasing lock "refresh_cache-baecb573-a3f1-42db-bc1b-788220753a7d" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.compute.manager [None 
req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Instance network_info: |[{"id": "79499d87-7e81-458e-ac87-ddde51eaefc6", "address": "fa:16:3e:21:9b:f7", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap79499d87-7e", "ovs_interfaceid": "79499d87-7e81-458e-ac87-ddde51eaefc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-564d1526-1eba-4f62-90b7-f37efec8cfac req-eb7580c8-c1a7-46cf-bd1b-87bbb6767c8f service nova] Acquired lock "refresh_cache-baecb573-a3f1-42db-bc1b-788220753a7d" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.network.neutron [req-564d1526-1eba-4f62-90b7-f37efec8cfac req-eb7580c8-c1a7-46cf-bd1b-87bbb6767c8f service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Refreshing network info cache for port 79499d87-7e81-458e-ac87-ddde51eaefc6 {{(pid=71283) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Start _get_guest_xml network_info=[{"id": "79499d87-7e81-458e-ac87-ddde51eaefc6", "address": "fa:16:3e:21:9b:f7", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap79499d87-7e", "ovs_interfaceid": "79499d87-7e81-458e-ac87-ddde51eaefc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=) 
rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '3687b278-c2bc-46f2-9b7e-579c8d06fe41'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 10:37:17 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:37:17 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71283) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T10:25:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=), allow threads: True {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Flavor limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 
tempest-AttachVolumeNegativeTest-1619866573-project-member] Image limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Flavor pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Image pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 
10:37:17 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Got 1 possible topologies {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:37:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1283873387',display_name='tempest-AttachVolumeNegativeTest-server-1283873387',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1283873387',id=15,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJsFZ6zFqNNpK8NNtdhopwc6Wl3+Dy9fVkPXioOSVb5zIBlsJr3Vfpw4JSmE38Ewd24wugKZHZWDCHIcSgnyz2KZEoYLhZutGGub8KFvP0CYywEgGYJIN+RHNtlh3mNweQ==',key_name='tempest-keypair-58831496',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12e621999b00481c839affc4e83ce37c',ramdisk_id='',reservation_id='r-mav5ivfr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-161986657
3',owner_user_name='tempest-AttachVolumeNegativeTest-1619866573-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:37:16Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a04be58cd354d508616edd9d5eeff54',uuid=baecb573-a3f1-42db-bc1b-788220753a7d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "79499d87-7e81-458e-ac87-ddde51eaefc6", "address": "fa:16:3e:21:9b:f7", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap79499d87-7e", "ovs_interfaceid": "79499d87-7e81-458e-ac87-ddde51eaefc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71283) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Converting VIF {"id": "79499d87-7e81-458e-ac87-ddde51eaefc6", "address": "fa:16:3e:21:9b:f7", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": 
"tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap79499d87-7e", "ovs_interfaceid": "79499d87-7e81-458e-ac87-ddde51eaefc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:21:9b:f7,bridge_name='br-int',has_traffic_filtering=True,id=79499d87-7e81-458e-ac87-ddde51eaefc6,network=Network(51b24d7b-a37b-453a-a335-3ba809f26953),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79499d87-7e') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.objects.instance [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lazy-loading 'pci_devices' on Instance uuid baecb573-a3f1-42db-bc1b-788220753a7d {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None 
req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] End _get_guest_xml xml= Apr 20 10:37:17 user nova-compute[71283]: baecb573-a3f1-42db-bc1b-788220753a7d Apr 20 10:37:17 user nova-compute[71283]: instance-0000000f Apr 20 10:37:17 user nova-compute[71283]: 131072 Apr 20 10:37:17 user nova-compute[71283]: 1 Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: tempest-AttachVolumeNegativeTest-server-1283873387 Apr 20 10:37:17 user nova-compute[71283]: 2023-04-20 10:37:17 Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: 128 Apr 20 10:37:17 user nova-compute[71283]: 1 Apr 20 10:37:17 user nova-compute[71283]: 0 Apr 20 10:37:17 user nova-compute[71283]: 0 Apr 20 10:37:17 user nova-compute[71283]: 1 Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: tempest-AttachVolumeNegativeTest-1619866573-project-member Apr 20 10:37:17 user nova-compute[71283]: tempest-AttachVolumeNegativeTest-1619866573 Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: OpenStack Foundation Apr 20 10:37:17 user nova-compute[71283]: OpenStack Nova Apr 20 10:37:17 user nova-compute[71283]: 0.0.0 Apr 20 10:37:17 user nova-compute[71283]: baecb573-a3f1-42db-bc1b-788220753a7d Apr 20 10:37:17 user 
nova-compute[71283]: baecb573-a3f1-42db-bc1b-788220753a7d Apr 20 10:37:17 user nova-compute[71283]: Virtual Machine Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: hvm Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Nehalem Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: /dev/urandom Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 
10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: Apr 20 10:37:17 user nova-compute[71283]: {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:37:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1283873387',display_name='tempest-AttachVolumeNegativeTest-server-1283873387',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1283873387',id=15,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJsFZ6zFqNNpK8NNtdhopwc6Wl3+Dy9fVkPXioOSVb5zIBlsJr3Vfpw4JSmE38Ewd24wugKZHZWDCHIcSgnyz2KZEoYLhZutGGub8KFvP0CYywEgGYJIN+RHNtlh3mNweQ==',key_name='tempest-keypair-58831496',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12e621999b00481c839affc4e83ce37c',ramdisk_id='',reservation_id='r-mav5ivfr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-1619866573',owner_user_name='tempest-AttachVolumeNegativeTest-1619866573-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:37:16Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a04be58cd354d508616edd9d5eeff54',uuid=baecb573-a3f1-42db-bc1b-788220753a7d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "79499d87-7e81-458e-ac87-ddde51eaefc6", "address": "fa:16:3e:21:9b:f7", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": 
{"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap79499d87-7e", "ovs_interfaceid": "79499d87-7e81-458e-ac87-ddde51eaefc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Converting VIF {"id": "79499d87-7e81-458e-ac87-ddde51eaefc6", "address": "fa:16:3e:21:9b:f7", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap79499d87-7e", "ovs_interfaceid": "79499d87-7e81-458e-ac87-ddde51eaefc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:21:9b:f7,bridge_name='br-int',has_traffic_filtering=True,id=79499d87-7e81-458e-ac87-ddde51eaefc6,network=Network(51b24d7b-a37b-453a-a335-3ba809f26953),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79499d87-7e') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG os_vif [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:21:9b:f7,bridge_name='br-int',has_traffic_filtering=True,id=79499d87-7e81-458e-ac87-ddde51eaefc6,network=Network(51b24d7b-a37b-453a-a335-3ba809f26953),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79499d87-7e') {{(pid=71283) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71283) do_commit 
/usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap79499d87-7e, may_exist=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap79499d87-7e, col_values=(('external_ids', {'iface-id': '79499d87-7e81-458e-ac87-ddde51eaefc6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:21:9b:f7', 'vm-uuid': 'baecb573-a3f1-42db-bc1b-788220753a7d'}),)) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:17 user nova-compute[71283]: INFO os_vif [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Successfully plugged vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:21:9b:f7,bridge_name='br-int',has_traffic_filtering=True,id=79499d87-7e81-458e-ac87-ddde51eaefc6,network=Network(51b24d7b-a37b-453a-a335-3ba809f26953),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79499d87-7e') Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] No BDM found with device name vda, not building metadata. {{(pid=71283) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 10:37:17 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] No VIF found with MAC fa:16:3e:21:9b:f7, not building metadata {{(pid=71283) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 10:37:18 user nova-compute[71283]: DEBUG nova.network.neutron [req-564d1526-1eba-4f62-90b7-f37efec8cfac req-eb7580c8-c1a7-46cf-bd1b-87bbb6767c8f service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Updated VIF entry in instance network info cache for port 79499d87-7e81-458e-ac87-ddde51eaefc6. 
{{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:37:18 user nova-compute[71283]: DEBUG nova.network.neutron [req-564d1526-1eba-4f62-90b7-f37efec8cfac req-eb7580c8-c1a7-46cf-bd1b-87bbb6767c8f service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Updating instance_info_cache with network_info: [{"id": "79499d87-7e81-458e-ac87-ddde51eaefc6", "address": "fa:16:3e:21:9b:f7", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap79499d87-7e", "ovs_interfaceid": "79499d87-7e81-458e-ac87-ddde51eaefc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:37:18 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-564d1526-1eba-4f62-90b7-f37efec8cfac req-eb7580c8-c1a7-46cf-bd1b-87bbb6767c8f service nova] Releasing lock "refresh_cache-baecb573-a3f1-42db-bc1b-788220753a7d" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:37:18 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) 
run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:37:18 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:37:18 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:37:18 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:37:18 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:37:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:19 user nova-compute[71283]: DEBUG nova.compute.manager [req-05dd9862-68d2-4e0e-a82d-225d15ed6ae7 req-6d6104fc-a4a8-4e74-81a1-62b8ba2cc741 service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Received event network-vif-plugged-79499d87-7e81-458e-ac87-ddde51eaefc6 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:37:19 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-05dd9862-68d2-4e0e-a82d-225d15ed6ae7 req-6d6104fc-a4a8-4e74-81a1-62b8ba2cc741 service nova] Acquiring lock "baecb573-a3f1-42db-bc1b-788220753a7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:37:19 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-05dd9862-68d2-4e0e-a82d-225d15ed6ae7 req-6d6104fc-a4a8-4e74-81a1-62b8ba2cc741 service nova] Lock "baecb573-a3f1-42db-bc1b-788220753a7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:37:19 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-05dd9862-68d2-4e0e-a82d-225d15ed6ae7 req-6d6104fc-a4a8-4e74-81a1-62b8ba2cc741 service nova] Lock "baecb573-a3f1-42db-bc1b-788220753a7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:37:19 user nova-compute[71283]: DEBUG nova.compute.manager [req-05dd9862-68d2-4e0e-a82d-225d15ed6ae7 req-6d6104fc-a4a8-4e74-81a1-62b8ba2cc741 service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] No waiting events found dispatching network-vif-plugged-79499d87-7e81-458e-ac87-ddde51eaefc6 {{(pid=71283) 
pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:37:19 user nova-compute[71283]: WARNING nova.compute.manager [req-05dd9862-68d2-4e0e-a82d-225d15ed6ae7 req-6d6104fc-a4a8-4e74-81a1-62b8ba2cc741 service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Received unexpected event network-vif-plugged-79499d87-7e81-458e-ac87-ddde51eaefc6 for instance with vm_state building and task_state spawning. Apr 20 10:37:19 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/baecb573-a3f1-42db-bc1b-788220753a7d/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:37:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:19 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/baecb573-a3f1-42db-bc1b-788220753a7d/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:37:19 user nova-compute[71283]: DEBUG 
oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/baecb573-a3f1-42db-bc1b-788220753a7d/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:37:20 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/baecb573-a3f1-42db-bc1b-788220753a7d/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:37:20 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:37:20 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk --force-share --output=json" returned: 0 in 0.127s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:37:20 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd 
(subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:37:20 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867/disk --force-share --output=json" returned: 0 in 0.126s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:37:20 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:37:20 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:37:20 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env 
LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:37:20 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Resumed> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:37:21 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] VM Resumed (Lifecycle Event) Apr 20 10:37:21 user nova-compute[71283]: DEBUG nova.compute.manager [req-99e53c9f-a6ea-4542-bc0d-cde2e7b4e6bf req-857ac03a-55d1-4b53-bd36-8754b52ead91 service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Received event network-vif-plugged-79499d87-7e81-458e-ac87-ddde51eaefc6 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-99e53c9f-a6ea-4542-bc0d-cde2e7b4e6bf req-857ac03a-55d1-4b53-bd36-8754b52ead91 service nova] Acquiring lock "baecb573-a3f1-42db-bc1b-788220753a7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils 
[req-99e53c9f-a6ea-4542-bc0d-cde2e7b4e6bf req-857ac03a-55d1-4b53-bd36-8754b52ead91 service nova] Lock "baecb573-a3f1-42db-bc1b-788220753a7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-99e53c9f-a6ea-4542-bc0d-cde2e7b4e6bf req-857ac03a-55d1-4b53-bd36-8754b52ead91 service nova] Lock "baecb573-a3f1-42db-bc1b-788220753a7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG nova.compute.manager [req-99e53c9f-a6ea-4542-bc0d-cde2e7b4e6bf req-857ac03a-55d1-4b53-bd36-8754b52ead91 service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] No waiting events found dispatching network-vif-plugged-79499d87-7e81-458e-ac87-ddde51eaefc6 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:37:21 user nova-compute[71283]: WARNING nova.compute.manager [req-99e53c9f-a6ea-4542-bc0d-cde2e7b4e6bf req-857ac03a-55d1-4b53-bd36-8754b52ead91 service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Received unexpected event network-vif-plugged-79499d87-7e81-458e-ac87-ddde51eaefc6 for instance with vm_state building and task_state spawning. 
Apr 20 10:37:21 user nova-compute[71283]: DEBUG nova.compute.manager [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Instance event wait completed in 0 seconds for {{(pid=71283) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Guest created on hypervisor {{(pid=71283) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-084bb752-ca30-4a20-9c35-0f04c0bcaac1 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Acquiring lock "64fb0d55-ef35-4386-86fe-00775b83a8d4" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-084bb752-ca30-4a20-9c35-0f04c0bcaac1 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Lock "64fb0d55-ef35-4386-86fe-00775b83a8d4" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:37:21 user 
nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-084bb752-ca30-4a20-9c35-0f04c0bcaac1 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Acquiring lock "64fb0d55-ef35-4386-86fe-00775b83a8d4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-084bb752-ca30-4a20-9c35-0f04c0bcaac1 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Lock "64fb0d55-ef35-4386-86fe-00775b83a8d4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-084bb752-ca30-4a20-9c35-0f04c0bcaac1 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Lock "64fb0d55-ef35-4386-86fe-00775b83a8d4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:37:21 user nova-compute[71283]: INFO nova.compute.manager [None req-084bb752-ca30-4a20-9c35-0f04c0bcaac1 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Terminating instance Apr 20 10:37:21 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Instance spawned successfully. 
Apr 20 10:37:21 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:37:21 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:37:21 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=8912MB free_disk=26.518627166748047GB free_vcpus=9 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": 
"0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": 
"0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG nova.compute.manager [None req-084bb752-ca30-4a20-9c35-0f04c0bcaac1 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Start destroying the instance on the hypervisor. 
{{(pid=71283) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Found default for hw_cdrom_bus of ide {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Found default for hw_disk_bus of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None 
req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Found default for hw_input_bus of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Found default for hw_pointer_model of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Found default for hw_video_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Found default for hw_vif_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:37:21 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 20 10:37:21 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Started> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:37:21 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] VM Started (Lifecycle Event) Apr 20 10:37:21 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 64fb0d55-ef35-4386-86fe-00775b83a8d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance e1115ad2-b858-4859-b1bb-2175f7eab867 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance baecb573-a3f1-42db-bc1b-788220753a7d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 3 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=896MB phys_disk=40GB used_disk=3GB total_vcpus=12 used_vcpus=3 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:21 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 10:37:21 user nova-compute[71283]: INFO nova.compute.manager [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Took 5.61 seconds to spawn the instance on the hypervisor. 
Apr 20 10:37:21 user nova-compute[71283]: DEBUG nova.compute.manager [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:21 user nova-compute[71283]: INFO nova.compute.manager [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Took 6.22 seconds to build instance. 
Apr 20 10:37:21 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-f736c356-5d06-4c49-ac43-efd99cde0dc7 tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "baecb573-a3f1-42db-bc1b-788220753a7d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.316s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:37:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.325s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:37:22 user nova-compute[71283]: DEBUG nova.compute.manager [req-98e50c6b-0c12-435e-ab9e-02b78e0ece86 req-728e1139-3fe7-4e64-98fe-901fce0c5a3d service nova] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Received event network-vif-unplugged-851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:37:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-98e50c6b-0c12-435e-ab9e-02b78e0ece86 req-728e1139-3fe7-4e64-98fe-901fce0c5a3d service nova] Acquiring lock "64fb0d55-ef35-4386-86fe-00775b83a8d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:37:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-98e50c6b-0c12-435e-ab9e-02b78e0ece86 req-728e1139-3fe7-4e64-98fe-901fce0c5a3d service nova] Lock "64fb0d55-ef35-4386-86fe-00775b83a8d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:37:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-98e50c6b-0c12-435e-ab9e-02b78e0ece86 req-728e1139-3fe7-4e64-98fe-901fce0c5a3d service nova] Lock "64fb0d55-ef35-4386-86fe-00775b83a8d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:37:22 user nova-compute[71283]: DEBUG nova.compute.manager [req-98e50c6b-0c12-435e-ab9e-02b78e0ece86 req-728e1139-3fe7-4e64-98fe-901fce0c5a3d service nova] [instance: 
64fb0d55-ef35-4386-86fe-00775b83a8d4] No waiting events found dispatching network-vif-unplugged-851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:37:22 user nova-compute[71283]: DEBUG nova.compute.manager [req-98e50c6b-0c12-435e-ab9e-02b78e0ece86 req-728e1139-3fe7-4e64-98fe-901fce0c5a3d service nova] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Received event network-vif-unplugged-851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1 for instance with task_state deleting. {{(pid=71283) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 10:37:22 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Instance destroyed successfully. Apr 20 10:37:22 user nova-compute[71283]: DEBUG nova.objects.instance [None req-084bb752-ca30-4a20-9c35-0f04c0bcaac1 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Lazy-loading 'resources' on Instance uuid 64fb0d55-ef35-4386-86fe-00775b83a8d4 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:37:22 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-084bb752-ca30-4a20-9c35-0f04c0bcaac1 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:28:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1446443267',display_name='tempest-ServerActionsTestJSON-server-1446443267',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serveractionstestjson-server-1446443267',id=4,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJHK0F1a8yp8Xvld+jR/l8lxpwDFkBJKvOkeodCIMdNIsy3rZb2cz1b4Lbds32NZ3fZXZBlAND7bKaMCernVFVPVOsJFwYkZamLCbeIJtO4ahR/NTi6fJvZoUyyvCYQG3g==',key_name='tempest-keypair-748576161',keypairs=<?>,launch_index=0,launched_at=2023-04-20T10:28:47Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8d9c5dd66fd0496a8e92b41d98fe1727',ramdisk_id='',reservation_id='r-01ns16zv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',imag
e_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerActionsTestJSON-692026271',owner_user_name='tempest-ServerActionsTestJSON-692026271-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2023-04-20T10:28:47Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bc5470a076e14c218cf04fad713ce074',uuid=64fb0d55-ef35-4386-86fe-00775b83a8d4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1", "address": "fa:16:3e:fe:b8:27", "network": {"id": "e68379ef-0781-4af6-be68-a311cdded61e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-509902278-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8d9c5dd66fd0496a8e92b41d98fe1727", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap851e6a9d-9d", "ovs_interfaceid": "851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 10:37:22 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-084bb752-ca30-4a20-9c35-0f04c0bcaac1 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Converting VIF {"id": "851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1", "address": "fa:16:3e:fe:b8:27", 
"network": {"id": "e68379ef-0781-4af6-be68-a311cdded61e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-509902278-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8d9c5dd66fd0496a8e92b41d98fe1727", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap851e6a9d-9d", "ovs_interfaceid": "851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:37:22 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-084bb752-ca30-4a20-9c35-0f04c0bcaac1 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fe:b8:27,bridge_name='br-int',has_traffic_filtering=True,id=851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1,network=Network(e68379ef-0781-4af6-be68-a311cdded61e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap851e6a9d-9d') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:37:22 user nova-compute[71283]: DEBUG os_vif [None req-084bb752-ca30-4a20-9c35-0f04c0bcaac1 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Unplugging vif 
VIFOpenVSwitch(active=True,address=fa:16:3e:fe:b8:27,bridge_name='br-int',has_traffic_filtering=True,id=851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1,network=Network(e68379ef-0781-4af6-be68-a311cdded61e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap851e6a9d-9d') {{(pid=71283) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 10:37:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap851e6a9d-9d, bridge=br-int, if_exists=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:37:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:37:22 user nova-compute[71283]: INFO os_vif [None req-084bb752-ca30-4a20-9c35-0f04c0bcaac1 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fe:b8:27,bridge_name='br-int',has_traffic_filtering=True,id=851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1,network=Network(e68379ef-0781-4af6-be68-a311cdded61e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap851e6a9d-9d') Apr 20 10:37:22 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-084bb752-ca30-4a20-9c35-0f04c0bcaac1 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] 
[instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Deleting instance files /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4_del Apr 20 10:37:22 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-084bb752-ca30-4a20-9c35-0f04c0bcaac1 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Deletion of /opt/stack/data/nova/instances/64fb0d55-ef35-4386-86fe-00775b83a8d4_del complete Apr 20 10:37:22 user nova-compute[71283]: INFO nova.compute.manager [None req-084bb752-ca30-4a20-9c35-0f04c0bcaac1 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Took 0.87 seconds to destroy the instance on the hypervisor. Apr 20 10:37:22 user nova-compute[71283]: DEBUG oslo.service.loopingcall [None req-084bb752-ca30-4a20-9c35-0f04c0bcaac1 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=71283) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 10:37:22 user nova-compute[71283]: DEBUG nova.compute.manager [-] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Deallocating network for instance {{(pid=71283) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 10:37:22 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] deallocate_for_instance() {{(pid=71283) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 10:37:23 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:37:23 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Took 0.78 seconds to deallocate network for instance. 
Apr 20 10:37:23 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-084bb752-ca30-4a20-9c35-0f04c0bcaac1 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:37:23 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-084bb752-ca30-4a20-9c35-0f04c0bcaac1 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:37:23 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-084bb752-ca30-4a20-9c35-0f04c0bcaac1 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:37:23 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-084bb752-ca30-4a20-9c35-0f04c0bcaac1 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} 
Apr 20 10:37:23 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-084bb752-ca30-4a20-9c35-0f04c0bcaac1 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.151s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:37:23 user nova-compute[71283]: INFO nova.scheduler.client.report [None req-084bb752-ca30-4a20-9c35-0f04c0bcaac1 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Deleted allocations for instance 64fb0d55-ef35-4386-86fe-00775b83a8d4 Apr 20 10:37:23 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-084bb752-ca30-4a20-9c35-0f04c0bcaac1 tempest-ServerActionsTestJSON-692026271 tempest-ServerActionsTestJSON-692026271-project-member] Lock "64fb0d55-ef35-4386-86fe-00775b83a8d4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.984s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:37:23 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:37:23 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:37:23 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Rebuilding the list of instances to heal {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 
10:37:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "refresh_cache-e1115ad2-b858-4859-b1bb-2175f7eab867" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:37:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquired lock "refresh_cache-e1115ad2-b858-4859-b1bb-2175f7eab867" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:37:24 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Forcefully refreshing network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 10:37:24 user nova-compute[71283]: DEBUG nova.objects.instance [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lazy-loading 'info_cache' on Instance uuid e1115ad2-b858-4859-b1bb-2175f7eab867 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:37:24 user nova-compute[71283]: DEBUG nova.compute.manager [req-91043abc-1e11-4631-8758-8bedbdbf647a req-7d6134cb-91f1-4bb2-9e2b-5b987c54b375 service nova] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Received event network-vif-plugged-851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:37:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-91043abc-1e11-4631-8758-8bedbdbf647a req-7d6134cb-91f1-4bb2-9e2b-5b987c54b375 service nova] Acquiring lock "64fb0d55-ef35-4386-86fe-00775b83a8d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:37:24 user nova-compute[71283]: 
DEBUG oslo_concurrency.lockutils [req-91043abc-1e11-4631-8758-8bedbdbf647a req-7d6134cb-91f1-4bb2-9e2b-5b987c54b375 service nova] Lock "64fb0d55-ef35-4386-86fe-00775b83a8d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:37:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-91043abc-1e11-4631-8758-8bedbdbf647a req-7d6134cb-91f1-4bb2-9e2b-5b987c54b375 service nova] Lock "64fb0d55-ef35-4386-86fe-00775b83a8d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:37:24 user nova-compute[71283]: DEBUG nova.compute.manager [req-91043abc-1e11-4631-8758-8bedbdbf647a req-7d6134cb-91f1-4bb2-9e2b-5b987c54b375 service nova] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] No waiting events found dispatching network-vif-plugged-851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:37:24 user nova-compute[71283]: WARNING nova.compute.manager [req-91043abc-1e11-4631-8758-8bedbdbf647a req-7d6134cb-91f1-4bb2-9e2b-5b987c54b375 service nova] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Received unexpected event network-vif-plugged-851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1 for instance with vm_state deleted and task_state None. 
Apr 20 10:37:24 user nova-compute[71283]: DEBUG nova.compute.manager [req-91043abc-1e11-4631-8758-8bedbdbf647a req-7d6134cb-91f1-4bb2-9e2b-5b987c54b375 service nova] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Received event network-vif-deleted-851e6a9d-9d4a-461c-b0c1-a7233bd9a8d1 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:37:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:24 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Updating instance_info_cache with network_info: [{"id": "c47ac25a-c0d0-4443-8759-c573281b63f9", "address": "fa:16:3e:bd:ab:60", "network": {"id": "ae2c97fe-5f89-4830-9303-439b5956dc16", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1597087439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d6d8880a9263444cba94725a83974403", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47ac25a-c0", "ovs_interfaceid": "c47ac25a-c0d0-4443-8759-c573281b63f9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:37:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Releasing lock "refresh_cache-e1115ad2-b858-4859-b1bb-2175f7eab867" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:37:24 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Updated the network info_cache for instance {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 10:37:24 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:37:24 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:37:24 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:37:24 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:37:24 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:37:24 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:37:27 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:27 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2935dc28-96d3-44e7-948e-2fc2d5454b80 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Acquiring lock "e1115ad2-b858-4859-b1bb-2175f7eab867" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:37:27 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2935dc28-96d3-44e7-948e-2fc2d5454b80 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "e1115ad2-b858-4859-b1bb-2175f7eab867" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.008s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:37:27 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2935dc28-96d3-44e7-948e-2fc2d5454b80 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Acquiring lock "e1115ad2-b858-4859-b1bb-2175f7eab867-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20
10:37:27 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2935dc28-96d3-44e7-948e-2fc2d5454b80 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "e1115ad2-b858-4859-b1bb-2175f7eab867-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:37:27 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2935dc28-96d3-44e7-948e-2fc2d5454b80 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "e1115ad2-b858-4859-b1bb-2175f7eab867-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:37:27 user nova-compute[71283]: INFO nova.compute.manager [None req-2935dc28-96d3-44e7-948e-2fc2d5454b80 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Terminating instance Apr 20 10:37:27 user nova-compute[71283]: DEBUG nova.compute.manager [None req-2935dc28-96d3-44e7-948e-2fc2d5454b80 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Start destroying the instance on the hypervisor.
{{(pid=71283) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 10:37:28 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:28 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:28 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:28 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:28 user nova-compute[71283]: DEBUG nova.compute.manager [req-a354d5bb-d55f-4c86-8e60-caf12f350c5f req-228ade6a-9d0e-42bd-9dd5-954298cb8533 service nova] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Received event network-vif-unplugged-c47ac25a-c0d0-4443-8759-c573281b63f9 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:37:28 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-a354d5bb-d55f-4c86-8e60-caf12f350c5f req-228ade6a-9d0e-42bd-9dd5-954298cb8533 service nova] Acquiring lock "e1115ad2-b858-4859-b1bb-2175f7eab867-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:37:28 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-a354d5bb-d55f-4c86-8e60-caf12f350c5f req-228ade6a-9d0e-42bd-9dd5-954298cb8533 service nova] Lock "e1115ad2-b858-4859-b1bb-2175f7eab867-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:37:28 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-a354d5bb-d55f-4c86-8e60-caf12f350c5f req-228ade6a-9d0e-42bd-9dd5-954298cb8533 service nova] Lock "e1115ad2-b858-4859-b1bb-2175f7eab867-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:37:28 user nova-compute[71283]: DEBUG nova.compute.manager [req-a354d5bb-d55f-4c86-8e60-caf12f350c5f req-228ade6a-9d0e-42bd-9dd5-954298cb8533 service nova] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] No waiting events found dispatching network-vif-unplugged-c47ac25a-c0d0-4443-8759-c573281b63f9 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:37:28 user nova-compute[71283]: DEBUG nova.compute.manager [req-a354d5bb-d55f-4c86-8e60-caf12f350c5f req-228ade6a-9d0e-42bd-9dd5-954298cb8533 service nova] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Received event network-vif-unplugged-c47ac25a-c0d0-4443-8759-c573281b63f9 for instance with task_state deleting. {{(pid=71283) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 10:37:28 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Instance destroyed successfully.
Apr 20 10:37:28 user nova-compute[71283]: DEBUG nova.objects.instance [None req-2935dc28-96d3-44e7-948e-2fc2d5454b80 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lazy-loading 'resources' on Instance uuid e1115ad2-b858-4859-b1bb-2175f7eab867 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:37:28 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-2935dc28-96d3-44e7-948e-2fc2d5454b80 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:31:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1901028417',display_name='tempest-ServerRescueNegativeTestJSON-server-1901028417',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1901028417',id=11,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2023-04-20T10:31:59Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d6d8880a9263444cba94725a83974403',ramdisk_id='',reservation_id='r-6r1f814t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_c
ontainer_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerRescueNegativeTestJSON-678567479',owner_user_name='tempest-ServerRescueNegativeTestJSON-678567479-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2023-04-20T10:32:00Z,user_data=None,user_id='729f2fdafa8e471e8f0de0c8323c36b5',uuid=e1115ad2-b858-4859-b1bb-2175f7eab867,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c47ac25a-c0d0-4443-8759-c573281b63f9", "address": "fa:16:3e:bd:ab:60", "network": {"id": "ae2c97fe-5f89-4830-9303-439b5956dc16", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1597087439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d6d8880a9263444cba94725a83974403", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47ac25a-c0", "ovs_interfaceid": "c47ac25a-c0d0-4443-8759-c573281b63f9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 10:37:28 user nova-compute[71283]: DEBUG nova.network.os_vif_util
[None req-2935dc28-96d3-44e7-948e-2fc2d5454b80 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Converting VIF {"id": "c47ac25a-c0d0-4443-8759-c573281b63f9", "address": "fa:16:3e:bd:ab:60", "network": {"id": "ae2c97fe-5f89-4830-9303-439b5956dc16", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1597087439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d6d8880a9263444cba94725a83974403", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47ac25a-c0", "ovs_interfaceid": "c47ac25a-c0d0-4443-8759-c573281b63f9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:37:28 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-2935dc28-96d3-44e7-948e-2fc2d5454b80 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bd:ab:60,bridge_name='br-int',has_traffic_filtering=True,id=c47ac25a-c0d0-4443-8759-c573281b63f9,network=Network(ae2c97fe-5f89-4830-9303-439b5956dc16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc47ac25a-c0') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:37:28 user nova-compute[71283]: DEBUG os_vif [None req-2935dc28-96d3-44e7-948e-2fc2d5454b80 
tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:bd:ab:60,bridge_name='br-int',has_traffic_filtering=True,id=c47ac25a-c0d0-4443-8759-c573281b63f9,network=Network(ae2c97fe-5f89-4830-9303-439b5956dc16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc47ac25a-c0') {{(pid=71283) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 10:37:28 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:28 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc47ac25a-c0, bridge=br-int, if_exists=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:37:28 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:37:28 user nova-compute[71283]: INFO os_vif [None req-2935dc28-96d3-44e7-948e-2fc2d5454b80 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:bd:ab:60,bridge_name='br-int',has_traffic_filtering=True,id=c47ac25a-c0d0-4443-8759-c573281b63f9,network=Network(ae2c97fe-5f89-4830-9303-439b5956dc16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc47ac25a-c0') Apr 20 10:37:28 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-2935dc28-96d3-44e7-948e-2fc2d5454b80 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: 
e1115ad2-b858-4859-b1bb-2175f7eab867] Deleting instance files /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867_del Apr 20 10:37:28 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-2935dc28-96d3-44e7-948e-2fc2d5454b80 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Deletion of /opt/stack/data/nova/instances/e1115ad2-b858-4859-b1bb-2175f7eab867_del complete Apr 20 10:37:28 user nova-compute[71283]: INFO nova.compute.manager [None req-2935dc28-96d3-44e7-948e-2fc2d5454b80 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Took 0.67 seconds to destroy the instance on the hypervisor. Apr 20 10:37:28 user nova-compute[71283]: DEBUG oslo.service.loopingcall [None req-2935dc28-96d3-44e7-948e-2fc2d5454b80 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=71283) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 10:37:28 user nova-compute[71283]: DEBUG nova.compute.manager [-] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Deallocating network for instance {{(pid=71283) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 10:37:28 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] deallocate_for_instance() {{(pid=71283) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 10:37:29 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:37:29 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Took 0.65 seconds to deallocate network for instance. 
Apr 20 10:37:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2935dc28-96d3-44e7-948e-2fc2d5454b80 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:37:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2935dc28-96d3-44e7-948e-2fc2d5454b80 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:37:29 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-2935dc28-96d3-44e7-948e-2fc2d5454b80 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:37:29 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-2935dc28-96d3-44e7-948e-2fc2d5454b80 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) 
set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:37:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2935dc28-96d3-44e7-948e-2fc2d5454b80 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.129s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:37:29 user nova-compute[71283]: INFO nova.scheduler.client.report [None req-2935dc28-96d3-44e7-948e-2fc2d5454b80 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Deleted allocations for instance e1115ad2-b858-4859-b1bb-2175f7eab867 Apr 20 10:37:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2935dc28-96d3-44e7-948e-2fc2d5454b80 tempest-ServerRescueNegativeTestJSON-678567479 tempest-ServerRescueNegativeTestJSON-678567479-project-member] Lock "e1115ad2-b858-4859-b1bb-2175f7eab867" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.718s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:37:30 user nova-compute[71283]: DEBUG nova.compute.manager [req-08c6c40e-bd73-406b-96f9-25146247fcf7 req-7a51eaf9-2ba1-4a2c-9e90-d292d618ecec service nova] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Received event network-vif-plugged-c47ac25a-c0d0-4443-8759-c573281b63f9 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:37:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-08c6c40e-bd73-406b-96f9-25146247fcf7 req-7a51eaf9-2ba1-4a2c-9e90-d292d618ecec service nova] Acquiring lock "e1115ad2-b858-4859-b1bb-2175f7eab867-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283)
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:37:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-08c6c40e-bd73-406b-96f9-25146247fcf7 req-7a51eaf9-2ba1-4a2c-9e90-d292d618ecec service nova] Lock "e1115ad2-b858-4859-b1bb-2175f7eab867-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:37:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-08c6c40e-bd73-406b-96f9-25146247fcf7 req-7a51eaf9-2ba1-4a2c-9e90-d292d618ecec service nova] Lock "e1115ad2-b858-4859-b1bb-2175f7eab867-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:37:30 user nova-compute[71283]: DEBUG nova.compute.manager [req-08c6c40e-bd73-406b-96f9-25146247fcf7 req-7a51eaf9-2ba1-4a2c-9e90-d292d618ecec service nova] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] No waiting events found dispatching network-vif-plugged-c47ac25a-c0d0-4443-8759-c573281b63f9 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:37:30 user nova-compute[71283]: WARNING nova.compute.manager [req-08c6c40e-bd73-406b-96f9-25146247fcf7 req-7a51eaf9-2ba1-4a2c-9e90-d292d618ecec service nova] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Received unexpected event network-vif-plugged-c47ac25a-c0d0-4443-8759-c573281b63f9 for instance with vm_state deleted and task_state None.
Apr 20 10:37:30 user nova-compute[71283]: DEBUG nova.compute.manager [req-08c6c40e-bd73-406b-96f9-25146247fcf7 req-7a51eaf9-2ba1-4a2c-9e90-d292d618ecec service nova] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Received event network-vif-deleted-c47ac25a-c0d0-4443-8759-c573281b63f9 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:37:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:37 user nova-compute[71283]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:37:37 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] VM Stopped (Lifecycle Event) Apr 20 10:37:37 user nova-compute[71283]: DEBUG nova.compute.manager [None req-9e3e4050-fe69-46c2-ae66-568721ade652 None None] [instance: 64fb0d55-ef35-4386-86fe-00775b83a8d4] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:37:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:39 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:42 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Acquiring lock "a8d3f0e1-b178-4197-b40a-8ebb2548432d" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:37:42 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "a8d3f0e1-b178-4197-b40a-8ebb2548432d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:37:42 user nova-compute[71283]: DEBUG nova.compute.manager [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Starting instance... {{(pid=71283) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 10:37:43 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:37:43 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:37:43 user nova-compute[71283]: DEBUG nova.virt.hardware [None 
req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71283) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 10:37:43 user nova-compute[71283]: INFO nova.compute.claims [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Claim successful on node user Apr 20 10:37:43 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:37:43 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:37:43 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.232s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:37:43 user nova-compute[71283]: DEBUG nova.compute.manager [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Start building networks asynchronously for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 10:37:43 user nova-compute[71283]: DEBUG nova.compute.manager [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Allocating IP information in the background. {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 10:37:43 user nova-compute[71283]: DEBUG nova.network.neutron [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] allocate_for_instance() {{(pid=71283) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 10:37:43 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 20 10:37:43 user nova-compute[71283]: DEBUG nova.compute.manager [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Start building block device mappings for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 10:37:43 user nova-compute[71283]: DEBUG nova.policy [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '878178b690704d048c37c79d596e953b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3e5913ed7eb944ddae4b38e1d746e7b9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71283) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 10:37:43 user nova-compute[71283]: DEBUG nova.compute.manager [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Start spawning the instance on the hypervisor. 
{{(pid=71283) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 10:37:43 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Creating instance directory {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 10:37:43 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Creating image(s) Apr 20 10:37:43 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Acquiring lock "/opt/stack/data/nova/instances/a8d3f0e1-b178-4197-b40a-8ebb2548432d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:37:43 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "/opt/stack/data/nova/instances/a8d3f0e1-b178-4197-b40a-8ebb2548432d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:37:43 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member]
Lock "/opt/stack/data/nova/instances/a8d3f0e1-b178-4197-b40a-8ebb2548432d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:37:43 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Acquiring lock "f3f8e65fcdb4d4a6185ef501c5ea0262d8df41bc" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:37:43 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "f3f8e65fcdb4d4a6185ef501c5ea0262d8df41bc" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:37:43 user nova-compute[71283]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:37:43 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] VM Stopped (Lifecycle Event) Apr 20 10:37:43 user nova-compute[71283]: DEBUG nova.compute.manager [None req-f205233a-528a-4bcb-a65d-b3bd78af090e None None] [instance: e1115ad2-b858-4859-b1bb-2175f7eab867] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:37:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:43 user 
nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/f3f8e65fcdb4d4a6185ef501c5ea0262d8df41bc.part --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:37:43 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/f3f8e65fcdb4d4a6185ef501c5ea0262d8df41bc.part --force-share --output=json" returned: 0 in 0.137s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:37:43 user nova-compute[71283]: DEBUG nova.virt.images [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] 69262355-7a22-44a5-8ebf-cd108cd9fd5c was qcow2, converting to raw {{(pid=71283) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 20 10:37:43 user nova-compute[71283]: DEBUG nova.privsep.utils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71283) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 20 10:37:43 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 
tempest-TestMinimumBasicScenario-244367270-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/f3f8e65fcdb4d4a6185ef501c5ea0262d8df41bc.part /opt/stack/data/nova/instances/_base/f3f8e65fcdb4d4a6185ef501c5ea0262d8df41bc.converted {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG nova.network.neutron [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Successfully created port: 4117b617-d8de-4936-aedb-a7d408fe0b34 {{(pid=71283) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/f3f8e65fcdb4d4a6185ef501c5ea0262d8df41bc.part /opt/stack/data/nova/instances/_base/f3f8e65fcdb4d4a6185ef501c5ea0262d8df41bc.converted" returned: 0 in 0.126s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/f3f8e65fcdb4d4a6185ef501c5ea0262d8df41bc.converted --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:37:44 user nova-compute[71283]: 
DEBUG oslo_concurrency.processutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/f3f8e65fcdb4d4a6185ef501c5ea0262d8df41bc.converted --force-share --output=json" returned: 0 in 0.130s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "f3f8e65fcdb4d4a6185ef501c5ea0262d8df41bc" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.734s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/f3f8e65fcdb4d4a6185ef501c5ea0262d8df41bc --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/f3f8e65fcdb4d4a6185ef501c5ea0262d8df41bc 
--force-share --output=json" returned: 0 in 0.136s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Acquiring lock "f3f8e65fcdb4d4a6185ef501c5ea0262d8df41bc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "f3f8e65fcdb4d4a6185ef501c5ea0262d8df41bc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/f3f8e65fcdb4d4a6185ef501c5ea0262d8df41bc --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 
tempest-TestMinimumBasicScenario-244367270-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/f3f8e65fcdb4d4a6185ef501c5ea0262d8df41bc --force-share --output=json" returned: 0 in 0.135s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/f3f8e65fcdb4d4a6185ef501c5ea0262d8df41bc,backing_fmt=raw /opt/stack/data/nova/instances/a8d3f0e1-b178-4197-b40a-8ebb2548432d/disk 1073741824 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/f3f8e65fcdb4d4a6185ef501c5ea0262d8df41bc,backing_fmt=raw /opt/stack/data/nova/instances/a8d3f0e1-b178-4197-b40a-8ebb2548432d/disk 1073741824" returned: 0 in 0.045s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "f3f8e65fcdb4d4a6185ef501c5ea0262d8df41bc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.185s {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/f3f8e65fcdb4d4a6185ef501c5ea0262d8df41bc --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/f3f8e65fcdb4d4a6185ef501c5ea0262d8df41bc --force-share --output=json" returned: 0 in 0.136s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Checking if we can resize image /opt/stack/data/nova/instances/a8d3f0e1-b178-4197-b40a-8ebb2548432d/disk. 
size=1073741824 {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a8d3f0e1-b178-4197-b40a-8ebb2548432d/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG nova.network.neutron [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Successfully updated port: 4117b617-d8de-4936-aedb-a7d408fe0b34 {{(pid=71283) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG nova.compute.manager [req-80a79d1f-0769-4fd5-ba38-722bea76e38b req-8d330f98-9045-450e-b38f-05cea0688f8f service nova] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Received event network-changed-4117b617-d8de-4936-aedb-a7d408fe0b34 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG nova.compute.manager [req-80a79d1f-0769-4fd5-ba38-722bea76e38b req-8d330f98-9045-450e-b38f-05cea0688f8f service nova] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Refreshing instance network info cache due to event network-changed-4117b617-d8de-4936-aedb-a7d408fe0b34. 
{{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-80a79d1f-0769-4fd5-ba38-722bea76e38b req-8d330f98-9045-450e-b38f-05cea0688f8f service nova] Acquiring lock "refresh_cache-a8d3f0e1-b178-4197-b40a-8ebb2548432d" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-80a79d1f-0769-4fd5-ba38-722bea76e38b req-8d330f98-9045-450e-b38f-05cea0688f8f service nova] Acquired lock "refresh_cache-a8d3f0e1-b178-4197-b40a-8ebb2548432d" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG nova.network.neutron [req-80a79d1f-0769-4fd5-ba38-722bea76e38b req-8d330f98-9045-450e-b38f-05cea0688f8f service nova] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Refreshing network info cache for port 4117b617-d8de-4936-aedb-a7d408fe0b34 {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Acquiring lock "refresh_cache-a8d3f0e1-b178-4197-b40a-8ebb2548432d" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG nova.network.neutron [req-80a79d1f-0769-4fd5-ba38-722bea76e38b req-8d330f98-9045-450e-b38f-05cea0688f8f service nova] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Instance cache missing network info. 
{{(pid=71283) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a8d3f0e1-b178-4197-b40a-8ebb2548432d/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Cannot resize image /opt/stack/data/nova/instances/a8d3f0e1-b178-4197-b40a-8ebb2548432d/disk to a smaller size. {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG nova.objects.instance [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lazy-loading 'migration_context' on Instance uuid a8d3f0e1-b178-4197-b40a-8ebb2548432d {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Created local disks {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 
tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Ensure instance console log exists: /opt/stack/data/nova/instances/a8d3f0e1-b178-4197-b40a-8ebb2548432d/console.log {{(pid=71283) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:37:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG nova.network.neutron [req-80a79d1f-0769-4fd5-ba38-722bea76e38b req-8d330f98-9045-450e-b38f-05cea0688f8f service nova] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG 
oslo_concurrency.lockutils [req-80a79d1f-0769-4fd5-ba38-722bea76e38b req-8d330f98-9045-450e-b38f-05cea0688f8f service nova] Releasing lock "refresh_cache-a8d3f0e1-b178-4197-b40a-8ebb2548432d" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Acquired lock "refresh_cache-a8d3f0e1-b178-4197-b40a-8ebb2548432d" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG nova.network.neutron [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Building network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG nova.network.neutron [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Instance cache missing network info. 
{{(pid=71283) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG nova.network.neutron [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Updating instance_info_cache with network_info: [{"id": "4117b617-d8de-4936-aedb-a7d408fe0b34", "address": "fa:16:3e:c0:6a:4f", "network": {"id": "0ebbf38f-49de-48cb-ab19-2af234cb93fc", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-649679020-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5913ed7eb944ddae4b38e1d746e7b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4117b617-d8", "ovs_interfaceid": "4117b617-d8de-4936-aedb-a7d408fe0b34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Releasing lock "refresh_cache-a8d3f0e1-b178-4197-b40a-8ebb2548432d" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG nova.compute.manager [None 
req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Instance network_info: |[{"id": "4117b617-d8de-4936-aedb-a7d408fe0b34", "address": "fa:16:3e:c0:6a:4f", "network": {"id": "0ebbf38f-49de-48cb-ab19-2af234cb93fc", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-649679020-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5913ed7eb944ddae4b38e1d746e7b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4117b617-d8", "ovs_interfaceid": "4117b617-d8de-4936-aedb-a7d408fe0b34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Start _get_guest_xml network_info=[{"id": "4117b617-d8de-4936-aedb-a7d408fe0b34", "address": "fa:16:3e:c0:6a:4f", "network": {"id": "0ebbf38f-49de-48cb-ab19-2af234cb93fc", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-649679020-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5913ed7eb944ddae4b38e1d746e7b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4117b617-d8", "ovs_interfaceid": "4117b617-d8de-4936-aedb-a7d408fe0b34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:37:40Z,direct_url=<?>,disk_format='qcow2',id=69262355-7a22-44a5-8ebf-cd108cd9fd5c,min_disk=0,min_ram=0,name='tempest-scenario-img--689445022',owner='3e5913ed7eb944ddae4b38e1d746e7b9',properties=ImageMetaProps,protected=<?>,size=16300544,status='active',tags=<?>,updated_at=2023-04-20T10:37:42Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '69262355-7a22-44a5-8ebf-cd108cd9fd5c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 10:37:45 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] This host appears to 
have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:37:45 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:37:45 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71283) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T10:25:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:37:40Z,direct_url=<?>,disk_format='qcow2',id=69262355-7a22-44a5-8ebf-cd108cd9fd5c,min_disk=0,min_ram=0,name='tempest-scenario-img--689445022',owner='3e5913ed7eb944ddae4b38e1d746e7b9',properties=ImageMetaProps,protected=<?>,size=16300544,status='active',tags=<?>,updated_at=2023-04-20T10:37:42Z,virtual_size=<?>,visibility=<?>), allow threads: True {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG nova.virt.hardware [None 
req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Flavor limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Image limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Flavor pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Image pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71283) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Got 1 possible topologies {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:37:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1705189616',display_name='tempest-TestMinimumBasicScenario-server-1705189616',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1705189616',id=16,image_ref='69262355-7a22-44a5-8ebf-cd108cd9fd5c',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO6n0xssgLNJ9YwnQ9V93NcaNq9fsgPjdGXY0hrReturf7mL9Oi05p0B44KoX3tqIuN1tQo+iN1yrvDbet7dn3Flu5MucjF+Z3ihqA8dypjS6Ft9XWhbqbaTjqdOQGkwAw==',key_name='tempest-TestMinimumBasicScenario-956051906',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3e5913ed7eb944ddae4b38e1d746e7b9',ramdisk_id='',reservation_id='r-u1dvr1tl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='69262355-7a22-44a5-8ebf-cd108cd9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-244367270',owner_user_name='tempest-TestMinimumBasicScenario-244367270-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2
023-04-20T10:37:43Z,user_data=None,user_id='878178b690704d048c37c79d596e953b',uuid=a8d3f0e1-b178-4197-b40a-8ebb2548432d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4117b617-d8de-4936-aedb-a7d408fe0b34", "address": "fa:16:3e:c0:6a:4f", "network": {"id": "0ebbf38f-49de-48cb-ab19-2af234cb93fc", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-649679020-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5913ed7eb944ddae4b38e1d746e7b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4117b617-d8", "ovs_interfaceid": "4117b617-d8de-4936-aedb-a7d408fe0b34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71283) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Converting VIF {"id": "4117b617-d8de-4936-aedb-a7d408fe0b34", "address": "fa:16:3e:c0:6a:4f", "network": {"id": "0ebbf38f-49de-48cb-ab19-2af234cb93fc", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-649679020-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5913ed7eb944ddae4b38e1d746e7b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4117b617-d8", "ovs_interfaceid": "4117b617-d8de-4936-aedb-a7d408fe0b34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:6a:4f,bridge_name='br-int',has_traffic_filtering=True,id=4117b617-d8de-4936-aedb-a7d408fe0b34,network=Network(0ebbf38f-49de-48cb-ab19-2af234cb93fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4117b617-d8') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG nova.objects.instance [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lazy-loading 'pci_devices' on Instance uuid a8d3f0e1-b178-4197-b40a-8ebb2548432d {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] End _get_guest_xml xml= Apr 20 10:37:45 user nova-compute[71283]: a8d3f0e1-b178-4197-b40a-8ebb2548432d Apr 20 10:37:45 user 
nova-compute[71283]: instance-00000010 Apr 20 10:37:45 user nova-compute[71283]: 131072 Apr 20 10:37:45 user nova-compute[71283]: 1 Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: tempest-TestMinimumBasicScenario-server-1705189616 Apr 20 10:37:45 user nova-compute[71283]: 2023-04-20 10:37:45 Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: 128 Apr 20 10:37:45 user nova-compute[71283]: 1 Apr 20 10:37:45 user nova-compute[71283]: 0 Apr 20 10:37:45 user nova-compute[71283]: 0 Apr 20 10:37:45 user nova-compute[71283]: 1 Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: tempest-TestMinimumBasicScenario-244367270-project-member Apr 20 10:37:45 user nova-compute[71283]: tempest-TestMinimumBasicScenario-244367270 Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: OpenStack Foundation Apr 20 10:37:45 user nova-compute[71283]: OpenStack Nova Apr 20 10:37:45 user nova-compute[71283]: 0.0.0 Apr 20 10:37:45 user nova-compute[71283]: a8d3f0e1-b178-4197-b40a-8ebb2548432d Apr 20 10:37:45 user nova-compute[71283]: a8d3f0e1-b178-4197-b40a-8ebb2548432d Apr 20 10:37:45 user nova-compute[71283]: Virtual Machine Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: hvm Apr 20 10:37:45 user 
nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Nehalem Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: /dev/urandom Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: Apr 20 10:37:45 user nova-compute[71283]: {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif 
[None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:37:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1705189616',display_name='tempest-TestMinimumBasicScenario-server-1705189616',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1705189616',id=16,image_ref='69262355-7a22-44a5-8ebf-cd108cd9fd5c',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO6n0xssgLNJ9YwnQ9V93NcaNq9fsgPjdGXY0hrReturf7mL9Oi05p0B44KoX3tqIuN1tQo+iN1yrvDbet7dn3Flu5MucjF+Z3ihqA8dypjS6Ft9XWhbqbaTjqdOQGkwAw==',key_name='tempest-TestMinimumBasicScenario-956051906',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3e5913ed7eb944ddae4b38e1d746e7b9',ramdisk_id='',reservation_id='r-u1dvr1tl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='69262355-7a22-44a5-8ebf-cd108cd9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenari
o-244367270',owner_user_name='tempest-TestMinimumBasicScenario-244367270-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:37:43Z,user_data=None,user_id='878178b690704d048c37c79d596e953b',uuid=a8d3f0e1-b178-4197-b40a-8ebb2548432d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4117b617-d8de-4936-aedb-a7d408fe0b34", "address": "fa:16:3e:c0:6a:4f", "network": {"id": "0ebbf38f-49de-48cb-ab19-2af234cb93fc", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-649679020-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5913ed7eb944ddae4b38e1d746e7b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4117b617-d8", "ovs_interfaceid": "4117b617-d8de-4936-aedb-a7d408fe0b34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Converting VIF {"id": "4117b617-d8de-4936-aedb-a7d408fe0b34", "address": "fa:16:3e:c0:6a:4f", "network": {"id": "0ebbf38f-49de-48cb-ab19-2af234cb93fc", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-649679020-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 
4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5913ed7eb944ddae4b38e1d746e7b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4117b617-d8", "ovs_interfaceid": "4117b617-d8de-4936-aedb-a7d408fe0b34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:6a:4f,bridge_name='br-int',has_traffic_filtering=True,id=4117b617-d8de-4936-aedb-a7d408fe0b34,network=Network(0ebbf38f-49de-48cb-ab19-2af234cb93fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4117b617-d8') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG os_vif [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:6a:4f,bridge_name='br-int',has_traffic_filtering=True,id=4117b617-d8de-4936-aedb-a7d408fe0b34,network=Network(0ebbf38f-49de-48cb-ab19-2af234cb93fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4117b617-d8') {{(pid=71283) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 10:37:45 user 
nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4117b617-d8, may_exist=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4117b617-d8, col_values=(('external_ids', {'iface-id': '4117b617-d8de-4936-aedb-a7d408fe0b34', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c0:6a:4f', 'vm-uuid': 'a8d3f0e1-b178-4197-b40a-8ebb2548432d'}),)) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:45 user nova-compute[71283]: INFO os_vif [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:6a:4f,bridge_name='br-int',has_traffic_filtering=True,id=4117b617-d8de-4936-aedb-a7d408fe0b34,network=Network(0ebbf38f-49de-48cb-ab19-2af234cb93fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4117b617-d8') Apr 20 10:37:45 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71283) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 10:37:45 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] No VIF found with MAC fa:16:3e:c0:6a:4f, not building metadata {{(pid=71283) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 10:37:46 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:46 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:46 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:47 user nova-compute[71283]: DEBUG nova.compute.manager [req-a229acc9-153e-4bb4-bc78-c43cf932ec6d req-817fc706-6d17-435e-8d33-36f29d97b036 service nova] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Received event network-vif-plugged-4117b617-d8de-4936-aedb-a7d408fe0b34 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:37:47 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-a229acc9-153e-4bb4-bc78-c43cf932ec6d req-817fc706-6d17-435e-8d33-36f29d97b036 service nova] Acquiring lock "a8d3f0e1-b178-4197-b40a-8ebb2548432d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:37:47 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-a229acc9-153e-4bb4-bc78-c43cf932ec6d req-817fc706-6d17-435e-8d33-36f29d97b036 service nova] Lock 
"a8d3f0e1-b178-4197-b40a-8ebb2548432d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:37:47 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-a229acc9-153e-4bb4-bc78-c43cf932ec6d req-817fc706-6d17-435e-8d33-36f29d97b036 service nova] Lock "a8d3f0e1-b178-4197-b40a-8ebb2548432d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:37:47 user nova-compute[71283]: DEBUG nova.compute.manager [req-a229acc9-153e-4bb4-bc78-c43cf932ec6d req-817fc706-6d17-435e-8d33-36f29d97b036 service nova] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] No waiting events found dispatching network-vif-plugged-4117b617-d8de-4936-aedb-a7d408fe0b34 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:37:47 user nova-compute[71283]: WARNING nova.compute.manager [req-a229acc9-153e-4bb4-bc78-c43cf932ec6d req-817fc706-6d17-435e-8d33-36f29d97b036 service nova] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Received unexpected event network-vif-plugged-4117b617-d8de-4936-aedb-a7d408fe0b34 for instance with vm_state building and task_state spawning. 
Apr 20 10:37:47 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:47 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:48 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Resumed> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:37:48 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] VM Resumed (Lifecycle Event) Apr 20 10:37:48 user nova-compute[71283]: DEBUG nova.compute.manager [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Instance event wait completed in 0 seconds for {{(pid=71283) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 10:37:48 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Guest created on hypervisor {{(pid=71283) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 10:37:48 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Instance spawned successfully. 
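The earlier ovsdbapp transaction (AddPortCommand followed by DbSetCommand on the Interface row) attaches external_ids that tie the tap device back to the Neutron port and the guest; OVN binds the port by matching iface-id. A sketch of assembling that mapping from the values visible in this log (the key names follow the logged col_values; the helper function itself is illustrative):

```python
def ovs_external_ids(vif_id, mac, instance_uuid):
    """Build the external_ids written to the OVS Interface row for a VIF."""
    return {
        "iface-id": vif_id,        # Neutron port UUID; OVN matches on this
        "iface-status": "active",
        "attached-mac": mac,       # guest MAC from the VIF
        "vm-uuid": instance_uuid,  # Nova instance UUID
    }

# The values logged for tap4117b617-d8 in this boot:
ids = ovs_external_ids(
    "4117b617-d8de-4936-aedb-a7d408fe0b34",
    "fa:16:3e:c0:6a:4f",
    "a8d3f0e1-b178-4197-b40a-8ebb2548432d",
)
```

On a live host the same mapping can be inspected with `ovs-vsctl list Interface tap4117b617-d8`.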
Apr 20 10:37:48 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 10:37:48 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:37:48 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:37:48 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Found default for hw_cdrom_bus of ide {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:37:48 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Found default for hw_disk_bus of virtio {{(pid=71283) 
_register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:37:48 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Found default for hw_input_bus of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:37:48 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Found default for hw_pointer_model of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:37:48 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Found default for hw_video_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:37:48 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Found default for hw_vif_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:37:48 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] During sync_power_state the instance has a pending task (spawning). Skip. 
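The "Attempting to register defaults" / "Found default for ..." lines record the libvirt driver pinning bus and model choices onto the instance so they stay stable across future moves and upgrades. A simplified sketch of that registration step; the defaults below are exactly the ones visible in this log, while the function name and shape are illustrative:

```python
# Defaults the driver reported for this instance in the log above.
DRIVER_DEFAULTS = {
    "hw_cdrom_bus": "ide",
    "hw_disk_bus": "virtio",
    "hw_input_bus": None,
    "hw_pointer_model": None,
    "hw_video_model": "virtio",
    "hw_vif_model": "virtio",
}

def register_undefined_details(image_props):
    """Return driver defaults for every property the image did not set."""
    registered = {}
    for prop, default in DRIVER_DEFAULTS.items():
        if prop not in image_props:
            # Logged as: "Found default for <prop> of <default>"
            registered[prop] = default
    return registered
```

Here the tempest image sets none of these properties, so all six defaults are registered.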
Apr 20 10:37:48 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Started> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:37:48 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] VM Started (Lifecycle Event) Apr 20 10:37:49 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:37:49 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:37:49 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 10:37:49 user nova-compute[71283]: INFO nova.compute.manager [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Took 5.56 seconds to spawn the instance on the hypervisor. 
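Both lifecycle events ("Resumed", then "Started") trigger a power-state sync, and both are skipped because the instance still has task_state spawning. The guard can be sketched as follows (simplified; the constants mirror the log, where DB power_state 0 is NOSTATE and VM power_state 1 is RUNNING, and the function shape is illustrative):

```python
NOSTATE, RUNNING = 0, 1

def sync_power_state(task_state, db_power_state, vm_power_state):
    """Decide whether to reconcile Nova's DB power state with the hypervisor's."""
    if task_state is not None:
        # An operation is in flight (e.g. 'spawning'); syncing now could race it.
        # Matches the log: "the instance has a pending task (spawning). Skip."
        return ("skip", db_power_state)
    if db_power_state != vm_power_state:
        return ("update", vm_power_state)
    return ("noop", db_power_state)
```

Once the spawn completes and task_state clears, a later sync would update the DB power_state from 0 to 1.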
Apr 20 10:37:49 user nova-compute[71283]: DEBUG nova.compute.manager [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:37:49 user nova-compute[71283]: DEBUG nova.compute.manager [req-b2ac732d-f0e4-4658-866a-72c875335bf3 req-19ad21fa-f718-4204-a116-d078d3a539f5 service nova] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Received event network-vif-plugged-4117b617-d8de-4936-aedb-a7d408fe0b34 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:37:49 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-b2ac732d-f0e4-4658-866a-72c875335bf3 req-19ad21fa-f718-4204-a116-d078d3a539f5 service nova] Acquiring lock "a8d3f0e1-b178-4197-b40a-8ebb2548432d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:37:49 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-b2ac732d-f0e4-4658-866a-72c875335bf3 req-19ad21fa-f718-4204-a116-d078d3a539f5 service nova] Lock "a8d3f0e1-b178-4197-b40a-8ebb2548432d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:37:49 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-b2ac732d-f0e4-4658-866a-72c875335bf3 req-19ad21fa-f718-4204-a116-d078d3a539f5 service nova] Lock "a8d3f0e1-b178-4197-b40a-8ebb2548432d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:37:49 user 
nova-compute[71283]: DEBUG nova.compute.manager [req-b2ac732d-f0e4-4658-866a-72c875335bf3 req-19ad21fa-f718-4204-a116-d078d3a539f5 service nova] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] No waiting events found dispatching network-vif-plugged-4117b617-d8de-4936-aedb-a7d408fe0b34 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:37:49 user nova-compute[71283]: WARNING nova.compute.manager [req-b2ac732d-f0e4-4658-866a-72c875335bf3 req-19ad21fa-f718-4204-a116-d078d3a539f5 service nova] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Received unexpected event network-vif-plugged-4117b617-d8de-4936-aedb-a7d408fe0b34 for instance with vm_state building and task_state spawning. Apr 20 10:37:49 user nova-compute[71283]: INFO nova.compute.manager [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Took 6.09 seconds to build instance. 
Apr 20 10:37:49 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-7e5dd0f9-febb-4069-be8c-4f34968dd8f6 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "a8d3f0e1-b178-4197-b40a-8ebb2548432d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.183s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:37:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:50 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:37:55 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:38:00 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:05 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:10 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:14 
user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:15 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:17 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:38:18 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:38:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:20 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:20 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:38:20 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:38:20 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:38:20 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:38:20 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:38:20 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:38:20 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:38:20 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:20 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/baecb573-a3f1-42db-bc1b-788220753a7d/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:38:20 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/baecb573-a3f1-42db-bc1b-788220753a7d/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:38:20 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/baecb573-a3f1-42db-bc1b-788220753a7d/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:38:21 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/baecb573-a3f1-42db-bc1b-788220753a7d/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:38:21 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a8d3f0e1-b178-4197-b40a-8ebb2548432d/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:38:21 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a8d3f0e1-b178-4197-b40a-8ebb2548432d/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:38:21 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a8d3f0e1-b178-4197-b40a-8ebb2548432d/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:38:21 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/a8d3f0e1-b178-4197-b40a-8ebb2548432d/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:38:21 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:38:21 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:38:21 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=8916MB free_disk=26.49951171875GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, 
{"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": 
"15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 10:38:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:38:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:38:21 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance baecb573-a3f1-42db-bc1b-788220753a7d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:38:21 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance a8d3f0e1-b178-4197-b40a-8ebb2548432d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:38:21 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:38:21 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:38:22 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:38:22 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 
'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:38:22 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:38:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.247s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:38:23 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:38:23 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquiring lock "4277833d-7119-4772-ba6d-dfa7368a652b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:38:24 user 
nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "4277833d-7119-4772-ba6d-dfa7368a652b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:38:24 user nova-compute[71283]: DEBUG nova.compute.manager [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Starting instance... {{(pid=71283) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 10:38:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:38:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:38:24 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 
tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71283) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 10:38:24 user nova-compute[71283]: INFO nova.compute.claims [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Claim successful on node user Apr 20 10:38:24 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:38:24 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:38:24 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:38:24 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:38:24 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:38:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.235s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:38:24 user nova-compute[71283]: DEBUG nova.compute.manager [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 
tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Start building networks asynchronously for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 10:38:24 user nova-compute[71283]: DEBUG nova.compute.manager [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Allocating IP information in the background. {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 10:38:24 user nova-compute[71283]: DEBUG nova.network.neutron [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] allocate_for_instance() {{(pid=71283) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 10:38:24 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 10:38:24 user nova-compute[71283]: DEBUG nova.compute.manager [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Start building block device mappings for instance. 
{{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG nova.policy [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1b7d729afb5942b8a1753ffaa4d5a268', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5bcb8e3930844068bd2b496914a4d764', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71283) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG nova.compute.manager [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Start spawning the instance on the hypervisor. 
{{(pid=71283) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Creating instance directory {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 10:38:25 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Creating image(s) Apr 20 10:38:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquiring lock "/opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "/opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-39bdd54c-be93-4f24-9920-b1bc617af088 
tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "/opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.131s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquiring lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" 
{{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.139s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 
tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk 1073741824 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk 1073741824" returned: 0 in 0.044s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.190s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.128s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Checking if we can resize image /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk. 
size=1073741824 {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Cannot resize image /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk to a smaller size. 
{{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG nova.objects.instance [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lazy-loading 'migration_context' on Instance uuid 4277833d-7119-4772-ba6d-dfa7368a652b {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Created local disks {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Ensure instance console log exists: /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/console.log {{(pid=71283) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 
tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG nova.network.neutron [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Successfully created port: 02bf30f9-074c-4a50-a4c8-d468c31067a8 {{(pid=71283) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "refresh_cache-baecb573-a3f1-42db-bc1b-788220753a7d" {{(pid=71283) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquired lock "refresh_cache-baecb573-a3f1-42db-bc1b-788220753a7d" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:38:25 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Forcefully refreshing network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Updating instance_info_cache with network_info: [{"id": "79499d87-7e81-458e-ac87-ddde51eaefc6", "address": "fa:16:3e:21:9b:f7", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap79499d87-7e", "ovs_interfaceid": "79499d87-7e81-458e-ac87-ddde51eaefc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info 
/opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Releasing lock "refresh_cache-baecb573-a3f1-42db-bc1b-788220753a7d" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Updated the network info_cache for instance {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.network.neutron [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Successfully updated port: 02bf30f9-074c-4a50-a4c8-d468c31067a8 {{(pid=71283) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquiring lock "refresh_cache-4277833d-7119-4772-ba6d-dfa7368a652b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquired lock "refresh_cache-4277833d-7119-4772-ba6d-dfa7368a652b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.network.neutron [None 
req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Building network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.compute.manager [req-f608079e-d72a-4d12-9626-567135acd1b3 req-65ff97a8-5f77-42e7-aced-c6104e6a4054 service nova] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Received event network-changed-02bf30f9-074c-4a50-a4c8-d468c31067a8 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.compute.manager [req-f608079e-d72a-4d12-9626-567135acd1b3 req-65ff97a8-5f77-42e7-aced-c6104e6a4054 service nova] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Refreshing instance network info cache due to event network-changed-02bf30f9-074c-4a50-a4c8-d468c31067a8. {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-f608079e-d72a-4d12-9626-567135acd1b3 req-65ff97a8-5f77-42e7-aced-c6104e6a4054 service nova] Acquiring lock "refresh_cache-4277833d-7119-4772-ba6d-dfa7368a652b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.network.neutron [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Instance cache missing network info. 
{{(pid=71283) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.network.neutron [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Updating instance_info_cache with network_info: [{"id": "02bf30f9-074c-4a50-a4c8-d468c31067a8", "address": "fa:16:3e:ee:07:da", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap02bf30f9-07", "ovs_interfaceid": "02bf30f9-074c-4a50-a4c8-d468c31067a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Releasing lock "refresh_cache-4277833d-7119-4772-ba6d-dfa7368a652b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG 
nova.compute.manager [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Instance network_info: |[{"id": "02bf30f9-074c-4a50-a4c8-d468c31067a8", "address": "fa:16:3e:ee:07:da", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap02bf30f9-07", "ovs_interfaceid": "02bf30f9-074c-4a50-a4c8-d468c31067a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-f608079e-d72a-4d12-9626-567135acd1b3 req-65ff97a8-5f77-42e7-aced-c6104e6a4054 service nova] Acquired lock "refresh_cache-4277833d-7119-4772-ba6d-dfa7368a652b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.network.neutron [req-f608079e-d72a-4d12-9626-567135acd1b3 req-65ff97a8-5f77-42e7-aced-c6104e6a4054 service nova] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Refreshing network info cache for port 02bf30f9-074c-4a50-a4c8-d468c31067a8 
{{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Start _get_guest_xml network_info=[{"id": "02bf30f9-074c-4a50-a4c8-d468c31067a8", "address": "fa:16:3e:ee:07:da", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap02bf30f9-07", "ovs_interfaceid": "02bf30f9-074c-4a50-a4c8-d468c31067a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '3687b278-c2bc-46f2-9b7e-579c8d06fe41'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 10:38:26 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:38:26 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71283) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T10:25:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=), allow threads: True {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Flavor limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-39bdd54c-be93-4f24-9920-b1bc617af088 
tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Image limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Flavor pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Image pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 
tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Got 1 possible topologies {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:38:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-307242091',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-307242091',id=17,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5bcb8e3930844068bd2b496914a4d764',ramdisk_id='',reservation_id='r-khwltncp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-134694710',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:38:25Z,user_data=None,user_id='1b7d729afb5942b8a1
753ffaa4d5a268',uuid=4277833d-7119-4772-ba6d-dfa7368a652b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "02bf30f9-074c-4a50-a4c8-d468c31067a8", "address": "fa:16:3e:ee:07:da", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap02bf30f9-07", "ovs_interfaceid": "02bf30f9-074c-4a50-a4c8-d468c31067a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71283) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Converting VIF {"id": "02bf30f9-074c-4a50-a4c8-d468c31067a8", "address": "fa:16:3e:ee:07:da", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": 
"10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap02bf30f9-07", "ovs_interfaceid": "02bf30f9-074c-4a50-a4c8-d468c31067a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:07:da,bridge_name='br-int',has_traffic_filtering=True,id=02bf30f9-074c-4a50-a4c8-d468c31067a8,network=Network(48558ac2-005d-42a1-a003-2662a04d6946),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02bf30f9-07') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.objects.instance [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lazy-loading 'pci_devices' on Instance uuid 4277833d-7119-4772-ba6d-dfa7368a652b {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] End _get_guest_xml xml= Apr 20 10:38:26 user nova-compute[71283]: 
[libvirt guest XML dump rendered unreadable: all angle-bracketed markup was stripped during log extraction, leaving only element text interleaved with syslog prefixes. Recoverable values: uuid 4277833d-7119-4772-ba6d-dfa7368a652b, name instance-00000011, memory 131072 KiB, 1 vCPU, owner tempest-ServerBootFromVolumeStableRescueTest-134694710 / tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0, os type hvm, CPU model Nehalem, RNG backend /dev/urandom] {{(pid=71283) _get_guest_xml 
/opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:38:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-307242091',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-307242091',id=17,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5bcb8e3930844068bd2b496914a4d764',ramdisk_id='',reservation_id='r-khwltncp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated
='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-134694710',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:38:25Z,user_data=None,user_id='1b7d729afb5942b8a1753ffaa4d5a268',uuid=4277833d-7119-4772-ba6d-dfa7368a652b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "02bf30f9-074c-4a50-a4c8-d468c31067a8", "address": "fa:16:3e:ee:07:da", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap02bf30f9-07", "ovs_interfaceid": "02bf30f9-074c-4a50-a4c8-d468c31067a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Converting VIF {"id": "02bf30f9-074c-4a50-a4c8-d468c31067a8", "address": "fa:16:3e:ee:07:da", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": 
"tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap02bf30f9-07", "ovs_interfaceid": "02bf30f9-074c-4a50-a4c8-d468c31067a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:07:da,bridge_name='br-int',has_traffic_filtering=True,id=02bf30f9-074c-4a50-a4c8-d468c31067a8,network=Network(48558ac2-005d-42a1-a003-2662a04d6946),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02bf30f9-07') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG os_vif [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Plugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:ee:07:da,bridge_name='br-int',has_traffic_filtering=True,id=02bf30f9-074c-4a50-a4c8-d468c31067a8,network=Network(48558ac2-005d-42a1-a003-2662a04d6946),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02bf30f9-07') {{(pid=71283) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap02bf30f9-07, may_exist=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap02bf30f9-07, col_values=(('external_ids', {'iface-id': '02bf30f9-074c-4a50-a4c8-d468c31067a8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ee:07:da', 'vm-uuid': '4277833d-7119-4772-ba6d-dfa7368a652b'}),)) 
{{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:26 user nova-compute[71283]: INFO os_vif [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:07:da,bridge_name='br-int',has_traffic_filtering=True,id=02bf30f9-074c-4a50-a4c8-d468c31067a8,network=Network(48558ac2-005d-42a1-a003-2662a04d6946),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02bf30f9-07') Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71283) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 10:38:26 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] No VIF found with MAC fa:16:3e:ee:07:da, not building metadata {{(pid=71283) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 10:38:27 user nova-compute[71283]: DEBUG nova.network.neutron [req-f608079e-d72a-4d12-9626-567135acd1b3 req-65ff97a8-5f77-42e7-aced-c6104e6a4054 service nova] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Updated VIF entry in instance network info cache for port 02bf30f9-074c-4a50-a4c8-d468c31067a8. {{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:38:27 user nova-compute[71283]: DEBUG nova.network.neutron [req-f608079e-d72a-4d12-9626-567135acd1b3 req-65ff97a8-5f77-42e7-aced-c6104e6a4054 service nova] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Updating instance_info_cache with network_info: [{"id": "02bf30f9-074c-4a50-a4c8-d468c31067a8", "address": "fa:16:3e:ee:07:da", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap02bf30f9-07", "ovs_interfaceid": 
"02bf30f9-074c-4a50-a4c8-d468c31067a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:38:27 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-f608079e-d72a-4d12-9626-567135acd1b3 req-65ff97a8-5f77-42e7-aced-c6104e6a4054 service nova] Releasing lock "refresh_cache-4277833d-7119-4772-ba6d-dfa7368a652b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:38:28 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:28 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:28 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:28 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:28 user nova-compute[71283]: DEBUG nova.compute.manager [req-428b5992-0236-4905-9df3-b97c17346d4b req-811d5da1-af90-455c-8659-22eaeb567c3e service nova] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Received event network-vif-plugged-02bf30f9-074c-4a50-a4c8-d468c31067a8 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:38:28 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-428b5992-0236-4905-9df3-b97c17346d4b req-811d5da1-af90-455c-8659-22eaeb567c3e service nova] Acquiring lock "4277833d-7119-4772-ba6d-dfa7368a652b-events" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:38:28 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-428b5992-0236-4905-9df3-b97c17346d4b req-811d5da1-af90-455c-8659-22eaeb567c3e service nova] Lock "4277833d-7119-4772-ba6d-dfa7368a652b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:38:28 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-428b5992-0236-4905-9df3-b97c17346d4b req-811d5da1-af90-455c-8659-22eaeb567c3e service nova] Lock "4277833d-7119-4772-ba6d-dfa7368a652b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:38:28 user nova-compute[71283]: DEBUG nova.compute.manager [req-428b5992-0236-4905-9df3-b97c17346d4b req-811d5da1-af90-455c-8659-22eaeb567c3e service nova] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] No waiting events found dispatching network-vif-plugged-02bf30f9-074c-4a50-a4c8-d468c31067a8 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:38:28 user nova-compute[71283]: WARNING nova.compute.manager [req-428b5992-0236-4905-9df3-b97c17346d4b req-811d5da1-af90-455c-8659-22eaeb567c3e service nova] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Received unexpected event network-vif-plugged-02bf30f9-074c-4a50-a4c8-d468c31067a8 for instance with vm_state building and task_state spawning. 
Apr 20 10:38:28 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:28 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:28 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:28 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:28 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:29 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:38:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:30 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Resumed> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:38:30 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] VM Resumed (Lifecycle Event) Apr 20 10:38:30 user nova-compute[71283]: DEBUG nova.compute.manager [None req-39bdd54c-be93-4f24-9920-b1bc617af088 
tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Instance event wait completed in 0 seconds for {{(pid=71283) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 10:38:30 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Guest created on hypervisor {{(pid=71283) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 10:38:30 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Instance spawned successfully. Apr 20 10:38:30 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 10:38:30 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:38:30 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, 
current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:38:30 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Found default for hw_cdrom_bus of ide {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:38:30 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Found default for hw_disk_bus of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:38:30 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Found default for hw_input_bus of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:38:30 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Found default for hw_pointer_model of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:38:30 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-39bdd54c-be93-4f24-9920-b1bc617af088 
tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Found default for hw_video_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:38:30 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Found default for hw_vif_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:38:30 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 20 10:38:30 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Started> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:38:30 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] VM Started (Lifecycle Event) Apr 20 10:38:30 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:38:30 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:38:30 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 10:38:30 user nova-compute[71283]: INFO nova.compute.manager [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Took 5.36 seconds to spawn the instance on the hypervisor. 
Apr 20 10:38:30 user nova-compute[71283]: DEBUG nova.compute.manager [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:38:30 user nova-compute[71283]: INFO nova.compute.manager [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Took 5.90 seconds to build instance. Apr 20 10:38:30 user nova-compute[71283]: DEBUG nova.compute.manager [req-cc77970a-1b99-41f8-95d6-819d29b2f818 req-4afe6a05-b9b3-4237-883e-c06c12f74b3a service nova] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Received event network-vif-plugged-02bf30f9-074c-4a50-a4c8-d468c31067a8 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:38:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-cc77970a-1b99-41f8-95d6-819d29b2f818 req-4afe6a05-b9b3-4237-883e-c06c12f74b3a service nova] Acquiring lock "4277833d-7119-4772-ba6d-dfa7368a652b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:38:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-cc77970a-1b99-41f8-95d6-819d29b2f818 req-4afe6a05-b9b3-4237-883e-c06c12f74b3a service nova] Lock "4277833d-7119-4772-ba6d-dfa7368a652b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:38:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils 
[req-cc77970a-1b99-41f8-95d6-819d29b2f818 req-4afe6a05-b9b3-4237-883e-c06c12f74b3a service nova] Lock "4277833d-7119-4772-ba6d-dfa7368a652b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:38:30 user nova-compute[71283]: DEBUG nova.compute.manager [req-cc77970a-1b99-41f8-95d6-819d29b2f818 req-4afe6a05-b9b3-4237-883e-c06c12f74b3a service nova] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] No waiting events found dispatching network-vif-plugged-02bf30f9-074c-4a50-a4c8-d468c31067a8 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:38:30 user nova-compute[71283]: WARNING nova.compute.manager [req-cc77970a-1b99-41f8-95d6-819d29b2f818 req-4afe6a05-b9b3-4237-883e-c06c12f74b3a service nova] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Received unexpected event network-vif-plugged-02bf30f9-074c-4a50-a4c8-d468c31067a8 for instance with vm_state active and task_state None. 
Apr 20 10:38:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-39bdd54c-be93-4f24-9920-b1bc617af088 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "4277833d-7119-4772-ba6d-dfa7368a652b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:38:31 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Acquiring lock "e1e6c1fe-e5bb-496f-90e8-c77962132b79" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:38:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "e1e6c1fe-e5bb-496f-90e8-c77962132b79" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:38:31 user nova-compute[71283]: DEBUG nova.compute.manager [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Starting instance... 
{{(pid=71283) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 10:38:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:38:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:38:31 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71283) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 10:38:31 user nova-compute[71283]: INFO nova.compute.claims [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Claim successful on node user Apr 20 10:38:32 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:38:32 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:38:32 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.287s {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:38:32 user nova-compute[71283]: DEBUG nova.compute.manager [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Start building networks asynchronously for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 10:38:32 user nova-compute[71283]: DEBUG nova.compute.manager [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Allocating IP information in the background. {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 10:38:32 user nova-compute[71283]: DEBUG nova.network.neutron [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] allocate_for_instance() {{(pid=71283) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 10:38:32 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 10:38:32 user nova-compute[71283]: DEBUG nova.compute.manager [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Start building block device mappings for instance. 
{{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 10:38:32 user nova-compute[71283]: DEBUG nova.policy [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0a931240713e465197b96fa84574ba23', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cb2d81481a7d4ec1b28e5377055b3ed7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71283) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 10:38:32 user nova-compute[71283]: DEBUG nova.compute.manager [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Start spawning the instance on the hypervisor. 
{{(pid=71283) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 10:38:32 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Creating instance directory {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 10:38:32 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Creating image(s) Apr 20 10:38:32 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Acquiring lock "/opt/stack/data/nova/instances/e1e6c1fe-e5bb-496f-90e8-c77962132b79/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:38:32 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "/opt/stack/data/nova/instances/e1e6c1fe-e5bb-496f-90e8-c77962132b79/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:38:32 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 
tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "/opt/stack/data/nova/instances/e1e6c1fe-e5bb-496f-90e8-c77962132b79/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:38:32 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:38:32 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.138s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:38:32 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Acquiring lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:38:32 user 
nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:38:32 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:38:32 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.139s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:38:32 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o 
backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/e1e6c1fe-e5bb-496f-90e8-c77962132b79/disk 1073741824 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:38:32 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/e1e6c1fe-e5bb-496f-90e8-c77962132b79/disk 1073741824" returned: 0 in 0.186s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:38:32 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.333s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:38:32 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:38:33 user nova-compute[71283]: DEBUG 
oslo_concurrency.processutils [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.152s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:38:33 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Checking if we can resize image /opt/stack/data/nova/instances/e1e6c1fe-e5bb-496f-90e8-c77962132b79/disk. size=1073741824 {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 10:38:33 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1e6c1fe-e5bb-496f-90e8-c77962132b79/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:38:33 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1e6c1fe-e5bb-496f-90e8-c77962132b79/disk --force-share --output=json" returned: 0 in 0.136s 
{{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:38:33 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Cannot resize image /opt/stack/data/nova/instances/e1e6c1fe-e5bb-496f-90e8-c77962132b79/disk to a smaller size. {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 10:38:33 user nova-compute[71283]: DEBUG nova.objects.instance [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lazy-loading 'migration_context' on Instance uuid e1e6c1fe-e5bb-496f-90e8-c77962132b79 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:38:33 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Created local disks {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 10:38:33 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Ensure instance console log exists: /opt/stack/data/nova/instances/e1e6c1fe-e5bb-496f-90e8-c77962132b79/console.log {{(pid=71283) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 10:38:33 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] 
Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:38:33 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:38:33 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:38:33 user nova-compute[71283]: DEBUG nova.network.neutron [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Successfully created port: 3a540308-3038-45f9-89b1-b7039742888b {{(pid=71283) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.network.neutron [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Successfully updated port: 3a540308-3038-45f9-89b1-b7039742888b {{(pid=71283) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None 
req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Acquiring lock "refresh_cache-e1e6c1fe-e5bb-496f-90e8-c77962132b79" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Acquired lock "refresh_cache-e1e6c1fe-e5bb-496f-90e8-c77962132b79" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.network.neutron [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Building network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.compute.manager [req-671c76db-e4c1-46cf-a741-0db0af11f3ef req-eb580fcf-0a7c-4fdb-8263-dbabb9edc8b5 service nova] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Received event network-changed-3a540308-3038-45f9-89b1-b7039742888b {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.compute.manager [req-671c76db-e4c1-46cf-a741-0db0af11f3ef req-eb580fcf-0a7c-4fdb-8263-dbabb9edc8b5 service nova] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Refreshing instance network info cache due to event network-changed-3a540308-3038-45f9-89b1-b7039742888b. 
{{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-671c76db-e4c1-46cf-a741-0db0af11f3ef req-eb580fcf-0a7c-4fdb-8263-dbabb9edc8b5 service nova] Acquiring lock "refresh_cache-e1e6c1fe-e5bb-496f-90e8-c77962132b79" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.network.neutron [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Instance cache missing network info. {{(pid=71283) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.network.neutron [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Updating instance_info_cache with network_info: [{"id": "3a540308-3038-45f9-89b1-b7039742888b", "address": "fa:16:3e:82:13:6a", "network": {"id": "e38408cb-95b3-47d6-8d3c-32650adda382", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1326937209-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb2d81481a7d4ec1b28e5377055b3ed7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", 
"details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a540308-30", "ovs_interfaceid": "3a540308-3038-45f9-89b1-b7039742888b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Releasing lock "refresh_cache-e1e6c1fe-e5bb-496f-90e8-c77962132b79" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.compute.manager [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Instance network_info: |[{"id": "3a540308-3038-45f9-89b1-b7039742888b", "address": "fa:16:3e:82:13:6a", "network": {"id": "e38408cb-95b3-47d6-8d3c-32650adda382", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1326937209-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb2d81481a7d4ec1b28e5377055b3ed7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a540308-30", "ovs_interfaceid": "3a540308-3038-45f9-89b1-b7039742888b", "qbh_params": null, "qbg_params": 
null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-671c76db-e4c1-46cf-a741-0db0af11f3ef req-eb580fcf-0a7c-4fdb-8263-dbabb9edc8b5 service nova] Acquired lock "refresh_cache-e1e6c1fe-e5bb-496f-90e8-c77962132b79" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.network.neutron [req-671c76db-e4c1-46cf-a741-0db0af11f3ef req-eb580fcf-0a7c-4fdb-8263-dbabb9edc8b5 service nova] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Refreshing network info cache for port 3a540308-3038-45f9-89b1-b7039742888b {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Start _get_guest_xml network_info=[{"id": "3a540308-3038-45f9-89b1-b7039742888b", "address": "fa:16:3e:82:13:6a", "network": {"id": "e38408cb-95b3-47d6-8d3c-32650adda382", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1326937209-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb2d81481a7d4ec1b28e5377055b3ed7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": 
"tap3a540308-30", "ovs_interfaceid": "3a540308-3038-45f9-89b1-b7039742888b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '3687b278-c2bc-46f2-9b7e-579c8d06fe41'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 10:38:34 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:38:34 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71283) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T10:25:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=), allow threads: True {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 
tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Flavor limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Image limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Flavor pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Image pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 
10:38:34 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Got 1 possible topologies {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:38:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-320923773',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-320923773',id=18,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLhpymhd3yO1Et+0eGAwsr/GXx7RZVrMrGei6a2cvPWCyKtNLDS8f5siOp4rn0Q/jHWAzkxE2ACk5TYmPElXFB6TEwAmLrBWzrznzaospNaBLAUvu71t3e1gkhL61SXmbQ==',key_name='tempest-keypair-890693079',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cb2d81481a7d4ec1b28e5377055b3ed7',ramdisk_id='',reservation_id='r-peph37r0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1878611374',owner_user_name='tempest-AttachVolumeShe
lveTestJSON-1878611374-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:38:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0a931240713e465197b96fa84574ba23',uuid=e1e6c1fe-e5bb-496f-90e8-c77962132b79,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3a540308-3038-45f9-89b1-b7039742888b", "address": "fa:16:3e:82:13:6a", "network": {"id": "e38408cb-95b3-47d6-8d3c-32650adda382", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1326937209-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb2d81481a7d4ec1b28e5377055b3ed7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a540308-30", "ovs_interfaceid": "3a540308-3038-45f9-89b1-b7039742888b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71283) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Converting VIF {"id": "3a540308-3038-45f9-89b1-b7039742888b", "address": "fa:16:3e:82:13:6a", "network": {"id": "e38408cb-95b3-47d6-8d3c-32650adda382", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1326937209-network", 
"subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb2d81481a7d4ec1b28e5377055b3ed7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a540308-30", "ovs_interfaceid": "3a540308-3038-45f9-89b1-b7039742888b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:13:6a,bridge_name='br-int',has_traffic_filtering=True,id=3a540308-3038-45f9-89b1-b7039742888b,network=Network(e38408cb-95b3-47d6-8d3c-32650adda382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a540308-30') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.objects.instance [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lazy-loading 'pci_devices' on Instance uuid e1e6c1fe-e5bb-496f-90e8-c77962132b79 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e 
tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] End _get_guest_xml xml= Apr 20 10:38:34 user nova-compute[71283]: e1e6c1fe-e5bb-496f-90e8-c77962132b79 Apr 20 10:38:34 user nova-compute[71283]: instance-00000012 Apr 20 10:38:34 user nova-compute[71283]: 131072 Apr 20 10:38:34 user nova-compute[71283]: 1 Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: tempest-AttachVolumeShelveTestJSON-server-320923773 Apr 20 10:38:34 user nova-compute[71283]: 2023-04-20 10:38:34 Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: 128 Apr 20 10:38:34 user nova-compute[71283]: 1 Apr 20 10:38:34 user nova-compute[71283]: 0 Apr 20 10:38:34 user nova-compute[71283]: 0 Apr 20 10:38:34 user nova-compute[71283]: 1 Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: tempest-AttachVolumeShelveTestJSON-1878611374-project-member Apr 20 10:38:34 user nova-compute[71283]: tempest-AttachVolumeShelveTestJSON-1878611374 Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: OpenStack Foundation Apr 20 10:38:34 user nova-compute[71283]: OpenStack Nova Apr 20 10:38:34 user nova-compute[71283]: 0.0.0 Apr 20 10:38:34 user nova-compute[71283]: e1e6c1fe-e5bb-496f-90e8-c77962132b79 Apr 20 10:38:34 user nova-compute[71283]: 
e1e6c1fe-e5bb-496f-90e8-c77962132b79 Apr 20 10:38:34 user nova-compute[71283]: Virtual Machine Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: hvm Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Nehalem Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: /dev/urandom Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user 
nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: Apr 20 10:38:34 user nova-compute[71283]: {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:38:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-320923773',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-320923773',id=18,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLhpymhd3yO1Et+0eGAwsr/GXx7RZVrMrGei6a2cvPWCyKtNLDS8f5siOp4rn0Q/jHWAzkxE2ACk5TYmPElXFB6TEwAmLrBWzrznzaospNaBLAUvu71t3e1gkhL61SXmbQ==',key_name='tempest-keypair-890693079',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cb2d81481a7d4ec1b28e5377055b3ed7',ramdisk_id='',reservation_id='r-peph37r0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1878611374',owner_user_name='tempest-AttachVolumeShelveTestJSON-1878611374-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:38:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0a931240713e465197b96fa84574ba23',uuid=e1e6c1fe-e5bb-496f-90e8-c77962132b79,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3a540308-3038-45f9-89b1-b7039742888b", "address": "fa:16:3e:82:13:6a", "network": {"id": "e38408cb-95b3-47d6-8d3c-32650adda382", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1326937209-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], 
"gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb2d81481a7d4ec1b28e5377055b3ed7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a540308-30", "ovs_interfaceid": "3a540308-3038-45f9-89b1-b7039742888b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Converting VIF {"id": "3a540308-3038-45f9-89b1-b7039742888b", "address": "fa:16:3e:82:13:6a", "network": {"id": "e38408cb-95b3-47d6-8d3c-32650adda382", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1326937209-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb2d81481a7d4ec1b28e5377055b3ed7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a540308-30", "ovs_interfaceid": "3a540308-3038-45f9-89b1-b7039742888b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:13:6a,bridge_name='br-int',has_traffic_filtering=True,id=3a540308-3038-45f9-89b1-b7039742888b,network=Network(e38408cb-95b3-47d6-8d3c-32650adda382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a540308-30') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG os_vif [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:13:6a,bridge_name='br-int',has_traffic_filtering=True,id=3a540308-3038-45f9-89b1-b7039742888b,network=Network(e38408cb-95b3-47d6-8d3c-32650adda382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a540308-30') {{(pid=71283) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71283) do_commit 
/usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3a540308-30, may_exist=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3a540308-30, col_values=(('external_ids', {'iface-id': '3a540308-3038-45f9-89b1-b7039742888b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:82:13:6a', 'vm-uuid': 'e1e6c1fe-e5bb-496f-90e8-c77962132b79'}),)) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:34 user nova-compute[71283]: INFO os_vif [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Successfully plugged vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:82:13:6a,bridge_name='br-int',has_traffic_filtering=True,id=3a540308-3038-45f9-89b1-b7039742888b,network=Network(e38408cb-95b3-47d6-8d3c-32650adda382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a540308-30') Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] No BDM found with device name vda, not building metadata. {{(pid=71283) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 10:38:34 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] No VIF found with MAC fa:16:3e:82:13:6a, not building metadata {{(pid=71283) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 10:38:35 user nova-compute[71283]: DEBUG nova.network.neutron [req-671c76db-e4c1-46cf-a741-0db0af11f3ef req-eb580fcf-0a7c-4fdb-8263-dbabb9edc8b5 service nova] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Updated VIF entry in instance network info cache for port 3a540308-3038-45f9-89b1-b7039742888b. 
{{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:38:35 user nova-compute[71283]: DEBUG nova.network.neutron [req-671c76db-e4c1-46cf-a741-0db0af11f3ef req-eb580fcf-0a7c-4fdb-8263-dbabb9edc8b5 service nova] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Updating instance_info_cache with network_info: [{"id": "3a540308-3038-45f9-89b1-b7039742888b", "address": "fa:16:3e:82:13:6a", "network": {"id": "e38408cb-95b3-47d6-8d3c-32650adda382", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1326937209-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb2d81481a7d4ec1b28e5377055b3ed7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a540308-30", "ovs_interfaceid": "3a540308-3038-45f9-89b1-b7039742888b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:38:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-671c76db-e4c1-46cf-a741-0db0af11f3ef req-eb580fcf-0a7c-4fdb-8263-dbabb9edc8b5 service nova] Releasing lock "refresh_cache-e1e6c1fe-e5bb-496f-90e8-c77962132b79" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:38:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:36 user 
nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:36 user nova-compute[71283]: DEBUG nova.compute.manager [req-5aac5dd0-dab5-41c4-bfd9-2c7afda087bf req-53c6e195-5193-4454-9fc8-6c37ef34f7b3 service nova] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Received event network-vif-plugged-3a540308-3038-45f9-89b1-b7039742888b {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:38:36 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-5aac5dd0-dab5-41c4-bfd9-2c7afda087bf req-53c6e195-5193-4454-9fc8-6c37ef34f7b3 service nova] Acquiring lock "e1e6c1fe-e5bb-496f-90e8-c77962132b79-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:38:36 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-5aac5dd0-dab5-41c4-bfd9-2c7afda087bf req-53c6e195-5193-4454-9fc8-6c37ef34f7b3 service nova] Lock "e1e6c1fe-e5bb-496f-90e8-c77962132b79-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:38:36 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-5aac5dd0-dab5-41c4-bfd9-2c7afda087bf req-53c6e195-5193-4454-9fc8-6c37ef34f7b3 service nova] Lock "e1e6c1fe-e5bb-496f-90e8-c77962132b79-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:38:36 user 
nova-compute[71283]: DEBUG nova.compute.manager [req-5aac5dd0-dab5-41c4-bfd9-2c7afda087bf req-53c6e195-5193-4454-9fc8-6c37ef34f7b3 service nova] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] No waiting events found dispatching network-vif-plugged-3a540308-3038-45f9-89b1-b7039742888b {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:38:36 user nova-compute[71283]: WARNING nova.compute.manager [req-5aac5dd0-dab5-41c4-bfd9-2c7afda087bf req-53c6e195-5193-4454-9fc8-6c37ef34f7b3 service nova] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Received unexpected event network-vif-plugged-3a540308-3038-45f9-89b1-b7039742888b for instance with vm_state building and task_state spawning. Apr 20 10:38:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:38 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Resumed> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:38:38 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] VM Resumed (Lifecycle Event) Apr 20 10:38:38 user nova-compute[71283]: DEBUG nova.compute.manager [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Instance event wait 
completed in 0 seconds for {{(pid=71283) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 10:38:38 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Guest created on hypervisor {{(pid=71283) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 10:38:38 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Instance spawned successfully. Apr 20 10:38:38 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 10:38:38 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:38:38 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:38:38 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None 
req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Found default for hw_cdrom_bus of ide {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:38:38 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Found default for hw_disk_bus of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:38:38 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Found default for hw_input_bus of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:38:38 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Found default for hw_pointer_model of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:38:38 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Found default for hw_video_model of virtio {{(pid=71283) _register_undefined_instance_details 
/opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:38:38 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Found default for hw_vif_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:38:38 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 10:38:38 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Started> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:38:38 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] VM Started (Lifecycle Event) Apr 20 10:38:38 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:38:38 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:38:38 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 
e1e6c1fe-e5bb-496f-90e8-c77962132b79] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 10:38:38 user nova-compute[71283]: INFO nova.compute.manager [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Took 5.77 seconds to spawn the instance on the hypervisor. Apr 20 10:38:38 user nova-compute[71283]: DEBUG nova.compute.manager [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:38:38 user nova-compute[71283]: INFO nova.compute.manager [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Took 6.39 seconds to build instance. 
Apr 20 10:38:38 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-24fe6e8c-b229-4c7a-98b4-1f89d46c1c5e tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "e1e6c1fe-e5bb-496f-90e8-c77962132b79" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.478s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:38:38 user nova-compute[71283]: DEBUG nova.compute.manager [req-c48e9598-e428-45e6-874d-c8eb76705303 req-73c18bff-4e29-447e-afec-29412850cd2d service nova] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Received event network-vif-plugged-3a540308-3038-45f9-89b1-b7039742888b {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:38:38 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-c48e9598-e428-45e6-874d-c8eb76705303 req-73c18bff-4e29-447e-afec-29412850cd2d service nova] Acquiring lock "e1e6c1fe-e5bb-496f-90e8-c77962132b79-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:38:38 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-c48e9598-e428-45e6-874d-c8eb76705303 req-73c18bff-4e29-447e-afec-29412850cd2d service nova] Lock "e1e6c1fe-e5bb-496f-90e8-c77962132b79-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:38:38 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-c48e9598-e428-45e6-874d-c8eb76705303 req-73c18bff-4e29-447e-afec-29412850cd2d service nova] Lock "e1e6c1fe-e5bb-496f-90e8-c77962132b79-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 
0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:38:38 user nova-compute[71283]: DEBUG nova.compute.manager [req-c48e9598-e428-45e6-874d-c8eb76705303 req-73c18bff-4e29-447e-afec-29412850cd2d service nova] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] No waiting events found dispatching network-vif-plugged-3a540308-3038-45f9-89b1-b7039742888b {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:38:38 user nova-compute[71283]: WARNING nova.compute.manager [req-c48e9598-e428-45e6-874d-c8eb76705303 req-73c18bff-4e29-447e-afec-29412850cd2d service nova] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Received unexpected event network-vif-plugged-3a540308-3038-45f9-89b1-b7039742888b for instance with vm_state active and task_state None. Apr 20 10:38:39 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:39 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:54 user 
nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:38:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:06 user nova-compute[71283]: DEBUG nova.compute.manager [req-35b43a8f-5012-438d-9430-8d3ba9a7c6cb req-d2b4a407-37ae-4e5e-b378-c964507e8daf service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Received event network-changed-79499d87-7e81-458e-ac87-ddde51eaefc6 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:39:06 user nova-compute[71283]: DEBUG nova.compute.manager [req-35b43a8f-5012-438d-9430-8d3ba9a7c6cb req-d2b4a407-37ae-4e5e-b378-c964507e8daf service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Refreshing instance network info cache due to event network-changed-79499d87-7e81-458e-ac87-ddde51eaefc6. 
{{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:39:06 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-35b43a8f-5012-438d-9430-8d3ba9a7c6cb req-d2b4a407-37ae-4e5e-b378-c964507e8daf service nova] Acquiring lock "refresh_cache-baecb573-a3f1-42db-bc1b-788220753a7d" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:39:06 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-35b43a8f-5012-438d-9430-8d3ba9a7c6cb req-d2b4a407-37ae-4e5e-b378-c964507e8daf service nova] Acquired lock "refresh_cache-baecb573-a3f1-42db-bc1b-788220753a7d" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:39:06 user nova-compute[71283]: DEBUG nova.network.neutron [req-35b43a8f-5012-438d-9430-8d3ba9a7c6cb req-d2b4a407-37ae-4e5e-b378-c964507e8daf service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Refreshing network info cache for port 79499d87-7e81-458e-ac87-ddde51eaefc6 {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:39:07 user nova-compute[71283]: DEBUG nova.network.neutron [req-35b43a8f-5012-438d-9430-8d3ba9a7c6cb req-d2b4a407-37ae-4e5e-b378-c964507e8daf service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Updated VIF entry in instance network info cache for port 79499d87-7e81-458e-ac87-ddde51eaefc6. 
{{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:39:07 user nova-compute[71283]: DEBUG nova.network.neutron [req-35b43a8f-5012-438d-9430-8d3ba9a7c6cb req-d2b4a407-37ae-4e5e-b378-c964507e8daf service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Updating instance_info_cache with network_info: [{"id": "79499d87-7e81-458e-ac87-ddde51eaefc6", "address": "fa:16:3e:21:9b:f7", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.17", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap79499d87-7e", "ovs_interfaceid": "79499d87-7e81-458e-ac87-ddde51eaefc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:39:07 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-35b43a8f-5012-438d-9430-8d3ba9a7c6cb req-d2b4a407-37ae-4e5e-b378-c964507e8daf service nova] Releasing lock "refresh_cache-baecb573-a3f1-42db-bc1b-788220753a7d" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:39:08 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e07e4309-287e-4134-bb8d-87606aed676f 
tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquiring lock "baecb573-a3f1-42db-bc1b-788220753a7d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:39:08 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e07e4309-287e-4134-bb8d-87606aed676f tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "baecb573-a3f1-42db-bc1b-788220753a7d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:39:08 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e07e4309-287e-4134-bb8d-87606aed676f tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquiring lock "baecb573-a3f1-42db-bc1b-788220753a7d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:39:08 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e07e4309-287e-4134-bb8d-87606aed676f tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "baecb573-a3f1-42db-bc1b-788220753a7d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:39:08 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e07e4309-287e-4134-bb8d-87606aed676f tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock 
"baecb573-a3f1-42db-bc1b-788220753a7d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:39:08 user nova-compute[71283]: INFO nova.compute.manager [None req-e07e4309-287e-4134-bb8d-87606aed676f tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Terminating instance Apr 20 10:39:08 user nova-compute[71283]: DEBUG nova.compute.manager [None req-e07e4309-287e-4134-bb8d-87606aed676f tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Start destroying the instance on the hypervisor. {{(pid=71283) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 10:39:08 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:08 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:08 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:08 user nova-compute[71283]: DEBUG nova.compute.manager [req-064c799d-4cde-4969-9000-24b0ac5683c7 req-0bc1c7be-3e42-4038-9a8a-8d805e7914c3 service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Received event network-vif-unplugged-79499d87-7e81-458e-ac87-ddde51eaefc6 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:39:08 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-064c799d-4cde-4969-9000-24b0ac5683c7 
req-0bc1c7be-3e42-4038-9a8a-8d805e7914c3 service nova] Acquiring lock "baecb573-a3f1-42db-bc1b-788220753a7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:39:08 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-064c799d-4cde-4969-9000-24b0ac5683c7 req-0bc1c7be-3e42-4038-9a8a-8d805e7914c3 service nova] Lock "baecb573-a3f1-42db-bc1b-788220753a7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:39:08 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-064c799d-4cde-4969-9000-24b0ac5683c7 req-0bc1c7be-3e42-4038-9a8a-8d805e7914c3 service nova] Lock "baecb573-a3f1-42db-bc1b-788220753a7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:39:08 user nova-compute[71283]: DEBUG nova.compute.manager [req-064c799d-4cde-4969-9000-24b0ac5683c7 req-0bc1c7be-3e42-4038-9a8a-8d805e7914c3 service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] No waiting events found dispatching network-vif-unplugged-79499d87-7e81-458e-ac87-ddde51eaefc6 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:39:08 user nova-compute[71283]: DEBUG nova.compute.manager [req-064c799d-4cde-4969-9000-24b0ac5683c7 req-0bc1c7be-3e42-4038-9a8a-8d805e7914c3 service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Received event network-vif-unplugged-79499d87-7e81-458e-ac87-ddde51eaefc6 for instance with task_state deleting. 
{{(pid=71283) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 10:39:08 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:08 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:08 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:08 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:08 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:08 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Instance destroyed successfully. 
Apr 20 10:39:08 user nova-compute[71283]: DEBUG nova.objects.instance [None req-e07e4309-287e-4134-bb8d-87606aed676f tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lazy-loading 'resources' on Instance uuid baecb573-a3f1-42db-bc1b-788220753a7d {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:39:08 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-e07e4309-287e-4134-bb8d-87606aed676f tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:37:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1283873387',display_name='tempest-AttachVolumeNegativeTest-server-1283873387',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1283873387',id=15,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJsFZ6zFqNNpK8NNtdhopwc6Wl3+Dy9fVkPXioOSVb5zIBlsJr3Vfpw4JSmE38Ewd24wugKZHZWDCHIcSgnyz2KZEoYLhZutGGub8KFvP0CYywEgGYJIN+RHNtlh3mNweQ==',key_name='tempest-keypair-58831496',keypairs=,launch_index=0,launched_at=2023-04-20T10:37:21Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='12e621999b00481c839affc4e83ce37c',ramdisk_id='',reservation_id='r-mav5ivfr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-1619866573',owner_user_name='tempest-AttachVolumeNegativeTest-1619866573-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T10:37:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a04be58cd354d508616edd9d5eeff54',uuid=baecb573-a3f1-42db-bc1b-788220753a7d,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "79499d87-7e81-458e-ac87-ddde51eaefc6", "address": "fa:16:3e:21:9b:f7", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": 
"tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.17", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap79499d87-7e", "ovs_interfaceid": "79499d87-7e81-458e-ac87-ddde51eaefc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 10:39:08 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-e07e4309-287e-4134-bb8d-87606aed676f tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Converting VIF {"id": "79499d87-7e81-458e-ac87-ddde51eaefc6", "address": "fa:16:3e:21:9b:f7", "network": {"id": "51b24d7b-a37b-453a-a335-3ba809f26953", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-453356843-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.17", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "12e621999b00481c839affc4e83ce37c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": 
"l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap79499d87-7e", "ovs_interfaceid": "79499d87-7e81-458e-ac87-ddde51eaefc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:39:08 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-e07e4309-287e-4134-bb8d-87606aed676f tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:21:9b:f7,bridge_name='br-int',has_traffic_filtering=True,id=79499d87-7e81-458e-ac87-ddde51eaefc6,network=Network(51b24d7b-a37b-453a-a335-3ba809f26953),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79499d87-7e') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:39:09 user nova-compute[71283]: DEBUG os_vif [None req-e07e4309-287e-4134-bb8d-87606aed676f tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:21:9b:f7,bridge_name='br-int',has_traffic_filtering=True,id=79499d87-7e81-458e-ac87-ddde51eaefc6,network=Network(51b24d7b-a37b-453a-a335-3ba809f26953),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79499d87-7e') {{(pid=71283) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 10:39:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap79499d87-7e, bridge=br-int, if_exists=True) {{(pid=71283) 
do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:39:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:39:09 user nova-compute[71283]: INFO os_vif [None req-e07e4309-287e-4134-bb8d-87606aed676f tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:21:9b:f7,bridge_name='br-int',has_traffic_filtering=True,id=79499d87-7e81-458e-ac87-ddde51eaefc6,network=Network(51b24d7b-a37b-453a-a335-3ba809f26953),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79499d87-7e') Apr 20 10:39:09 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-e07e4309-287e-4134-bb8d-87606aed676f tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Deleting instance files /opt/stack/data/nova/instances/baecb573-a3f1-42db-bc1b-788220753a7d_del Apr 20 10:39:09 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-e07e4309-287e-4134-bb8d-87606aed676f tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Deletion of /opt/stack/data/nova/instances/baecb573-a3f1-42db-bc1b-788220753a7d_del complete Apr 20 10:39:09 user nova-compute[71283]: INFO nova.compute.manager [None req-e07e4309-287e-4134-bb8d-87606aed676f tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] 
Took 0.77 seconds to destroy the instance on the hypervisor. Apr 20 10:39:09 user nova-compute[71283]: DEBUG oslo.service.loopingcall [None req-e07e4309-287e-4134-bb8d-87606aed676f tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=71283) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 10:39:09 user nova-compute[71283]: DEBUG nova.compute.manager [-] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Deallocating network for instance {{(pid=71283) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 10:39:09 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] deallocate_for_instance() {{(pid=71283) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 10:39:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:09 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: 
baecb573-a3f1-42db-bc1b-788220753a7d] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:39:09 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Took 0.86 seconds to deallocate network for instance. Apr 20 10:39:09 user nova-compute[71283]: DEBUG nova.compute.manager [req-862960d5-980e-43f7-b3ed-9c5a75fe4908 req-2b31b428-0163-4768-bfd5-841946514847 service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Received event network-vif-deleted-79499d87-7e81-458e-ac87-ddde51eaefc6 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:39:09 user nova-compute[71283]: INFO nova.compute.manager [req-862960d5-980e-43f7-b3ed-9c5a75fe4908 req-2b31b428-0163-4768-bfd5-841946514847 service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Neutron deleted interface 79499d87-7e81-458e-ac87-ddde51eaefc6; detaching it from the instance and deleting it from the info cache Apr 20 10:39:09 user nova-compute[71283]: DEBUG nova.network.neutron [req-862960d5-980e-43f7-b3ed-9c5a75fe4908 req-2b31b428-0163-4768-bfd5-841946514847 service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:39:09 user nova-compute[71283]: DEBUG nova.compute.manager [req-862960d5-980e-43f7-b3ed-9c5a75fe4908 req-2b31b428-0163-4768-bfd5-841946514847 service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Detach interface failed, port_id=79499d87-7e81-458e-ac87-ddde51eaefc6, reason: Instance baecb573-a3f1-42db-bc1b-788220753a7d could not be found. 
{{(pid=71283) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 20 10:39:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e07e4309-287e-4134-bb8d-87606aed676f tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:39:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e07e4309-287e-4134-bb8d-87606aed676f tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:39:10 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-e07e4309-287e-4134-bb8d-87606aed676f tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:39:10 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-e07e4309-287e-4134-bb8d-87606aed676f tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 
'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:39:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e07e4309-287e-4134-bb8d-87606aed676f tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.182s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:39:10 user nova-compute[71283]: INFO nova.scheduler.client.report [None req-e07e4309-287e-4134-bb8d-87606aed676f tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Deleted allocations for instance baecb573-a3f1-42db-bc1b-788220753a7d Apr 20 10:39:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e07e4309-287e-4134-bb8d-87606aed676f tempest-AttachVolumeNegativeTest-1619866573 tempest-AttachVolumeNegativeTest-1619866573-project-member] Lock "baecb573-a3f1-42db-bc1b-788220753a7d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.042s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:39:10 user nova-compute[71283]: DEBUG nova.compute.manager [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Received event network-vif-plugged-79499d87-7e81-458e-ac87-ddde51eaefc6 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:39:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] Acquiring lock "baecb573-a3f1-42db-bc1b-788220753a7d-events" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:39:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] Lock "baecb573-a3f1-42db-bc1b-788220753a7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:39:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] Lock "baecb573-a3f1-42db-bc1b-788220753a7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:39:10 user nova-compute[71283]: DEBUG nova.compute.manager [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] No waiting events found dispatching network-vif-plugged-79499d87-7e81-458e-ac87-ddde51eaefc6 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:39:10 user nova-compute[71283]: WARNING nova.compute.manager [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Received unexpected event network-vif-plugged-79499d87-7e81-458e-ac87-ddde51eaefc6 for instance with vm_state deleted and task_state None. 
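The "No waiting events found dispatching ..." / "Received unexpected event ..." pair above is easier to follow with the event-registry pattern spelled out: the compute manager takes a per-instance `"<uuid>-events"` lock, pops a waiting event if one was registered, and otherwise reports the event as unexpected (here, because the instance was already deleted). A minimal standard-library sketch of that pattern follows; it is an illustration only, not Nova's actual implementation, and the class/method names merely echo the log:

```python
import threading


class InstanceEvents:
    """Simplified sketch of a per-instance external-event registry.

    Mirrors the pattern visible in the log: acquire the per-instance
    events lock, pop a waiting event if present, otherwise signal
    "unexpected event". Not Nova's real code.
    """

    def __init__(self):
        self._lock = threading.Lock()
        # instance uuid -> {event name -> waiter}
        self._events = {}

    def prepare_for_event(self, instance_uuid, event_name):
        """Register a waiter before triggering the external operation."""
        waiter = threading.Event()
        with self._lock:
            self._events.setdefault(instance_uuid, {})[event_name] = waiter
        return waiter

    def pop_instance_event(self, instance_uuid, event_name):
        """Pop the matching waiter, or None if nothing is waiting."""
        with self._lock:
            return self._events.get(instance_uuid, {}).pop(event_name, None)

    def dispatch(self, instance_uuid, event_name):
        waiter = self.pop_instance_event(instance_uuid, event_name)
        if waiter is None:
            # This is the point where the real service logs
            # "Received unexpected event ..." at WARNING, e.g. when the
            # event arrives after the instance was deleted.
            return False
        waiter.set()
        return True
```

Dispatching `network-vif-plugged-...` with no registered waiter returns False, which corresponds to the repeated WARNING entries in the trace.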
Apr 20 10:39:10 user nova-compute[71283]: DEBUG nova.compute.manager [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Received event network-vif-plugged-79499d87-7e81-458e-ac87-ddde51eaefc6 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:39:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] Acquiring lock "baecb573-a3f1-42db-bc1b-788220753a7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:39:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] Lock "baecb573-a3f1-42db-bc1b-788220753a7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:39:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] Lock "baecb573-a3f1-42db-bc1b-788220753a7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:39:10 user nova-compute[71283]: DEBUG nova.compute.manager [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] No waiting events found dispatching network-vif-plugged-79499d87-7e81-458e-ac87-ddde51eaefc6 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 
20 10:39:10 user nova-compute[71283]: WARNING nova.compute.manager [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Received unexpected event network-vif-plugged-79499d87-7e81-458e-ac87-ddde51eaefc6 for instance with vm_state deleted and task_state None. Apr 20 10:39:10 user nova-compute[71283]: DEBUG nova.compute.manager [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Received event network-vif-plugged-79499d87-7e81-458e-ac87-ddde51eaefc6 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:39:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] Acquiring lock "baecb573-a3f1-42db-bc1b-788220753a7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:39:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] Lock "baecb573-a3f1-42db-bc1b-788220753a7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:39:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] Lock "baecb573-a3f1-42db-bc1b-788220753a7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:39:10 user 
nova-compute[71283]: DEBUG nova.compute.manager [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] No waiting events found dispatching network-vif-plugged-79499d87-7e81-458e-ac87-ddde51eaefc6 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:39:10 user nova-compute[71283]: WARNING nova.compute.manager [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Received unexpected event network-vif-plugged-79499d87-7e81-458e-ac87-ddde51eaefc6 for instance with vm_state deleted and task_state None. Apr 20 10:39:10 user nova-compute[71283]: DEBUG nova.compute.manager [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Received event network-vif-unplugged-79499d87-7e81-458e-ac87-ddde51eaefc6 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:39:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] Acquiring lock "baecb573-a3f1-42db-bc1b-788220753a7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:39:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] Lock "baecb573-a3f1-42db-bc1b-788220753a7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:39:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils 
[req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] Lock "baecb573-a3f1-42db-bc1b-788220753a7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:39:10 user nova-compute[71283]: DEBUG nova.compute.manager [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] No waiting events found dispatching network-vif-unplugged-79499d87-7e81-458e-ac87-ddde51eaefc6 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:39:10 user nova-compute[71283]: WARNING nova.compute.manager [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Received unexpected event network-vif-unplugged-79499d87-7e81-458e-ac87-ddde51eaefc6 for instance with vm_state deleted and task_state None. 
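Throughout this trace, oslo_concurrency.lockutils logs how long each named lock was waited on ("acquired ... :: waited 0.000s") and held ("released ... :: held 0.000s"). A minimal standard-library sketch of that instrumentation is below; it is illustrative only (the real lockutils also supports fair locks, semaphores, and external file locks), and `instrumented_lock` is a name invented here:

```python
import threading
import time
from contextlib import contextmanager

# Registry of named locks, itself guarded by a lock, so two callers
# asking for the same name contend on the same threading.Lock.
_locks = {}
_registry_lock = threading.Lock()


@contextmanager
def instrumented_lock(name):
    """Acquire the named lock and report wait/hold times, mimicking the
    'acquired :: waited Xs' / '"released" :: held Xs' lines above."""
    with _registry_lock:
        lock = _locks.setdefault(name, threading.Lock())
    t0 = time.monotonic()
    lock.acquire()
    waited = time.monotonic() - t0
    print(f'Lock "{name}" acquired :: waited {waited:.3f}s')
    t1 = time.monotonic()
    try:
        yield
    finally:
        lock.release()
        held = time.monotonic() - t1
        print(f'Lock "{name}" "released" :: held {held:.3f}s')
```

For example, `with instrumented_lock("compute_resources"): ...` prints the acquire and release lines around the critical section, much like the resource-tracker entries in the log.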
Apr 20 10:39:10 user nova-compute[71283]: DEBUG nova.compute.manager [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Received event network-vif-plugged-79499d87-7e81-458e-ac87-ddde51eaefc6 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:39:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] Acquiring lock "baecb573-a3f1-42db-bc1b-788220753a7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:39:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] Lock "baecb573-a3f1-42db-bc1b-788220753a7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:39:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] Lock "baecb573-a3f1-42db-bc1b-788220753a7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:39:10 user nova-compute[71283]: DEBUG nova.compute.manager [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] No waiting events found dispatching network-vif-plugged-79499d87-7e81-458e-ac87-ddde51eaefc6 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 
20 10:39:10 user nova-compute[71283]: WARNING nova.compute.manager [req-0e37d4f7-8961-45c3-b588-773a755f7f13 req-3ec38e1b-f29c-4228-b4b2-d4f77637c78b service nova] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Received unexpected event network-vif-plugged-79499d87-7e81-458e-ac87-ddde51eaefc6 for instance with vm_state deleted and task_state None. Apr 20 10:39:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:17 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:39:18 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:39:18 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:39:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:21 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task 
ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:39:21 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:39:21 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Cleaning up deleted instances {{(pid=71283) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 20 10:39:21 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] There are 0 instances to clean {{(pid=71283) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 20 10:39:22 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:39:22 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:39:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:39:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a 
None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:39:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:39:22 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:39:22 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:39:22 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:39:22 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): 
/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:39:23 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:39:23 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a8d3f0e1-b178-4197-b40a-8ebb2548432d/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:39:23 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a8d3f0e1-b178-4197-b40a-8ebb2548432d/disk --force-share --output=json" returned: 0 in 0.124s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:39:23 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C 
qemu-img info /opt/stack/data/nova/instances/a8d3f0e1-b178-4197-b40a-8ebb2548432d/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:39:23 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a8d3f0e1-b178-4197-b40a-8ebb2548432d/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:39:23 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1e6c1fe-e5bb-496f-90e8-c77962132b79/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:39:23 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1e6c1fe-e5bb-496f-90e8-c77962132b79/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:39:23 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1e6c1fe-e5bb-496f-90e8-c77962132b79/disk 
--force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:39:23 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1e6c1fe-e5bb-496f-90e8-c77962132b79/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:39:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:24 user nova-compute[71283]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:39:24 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] VM Stopped (Lifecycle Event) Apr 20 10:39:24 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:39:24 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
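The disk-audit commands above wrap `qemu-img info` in `python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- ...`, capping the child's address space at 1 GiB and its CPU time at 30 seconds so a hang or memory blow-up on a corrupt image cannot take down the compute host. A rough standard-library equivalent is sketched below; this is POSIX-only and illustrative (oslo's wrapper execs the limited command through a separate helper process rather than using `preexec_fn`), and `run_with_limits` is a name invented here:

```python
import resource
import subprocess


def run_with_limits(cmd, address_space=1 << 30, cpu_seconds=30):
    """Run *cmd* under RLIMIT_AS / RLIMIT_CPU caps, similar in spirit to
    `python -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- ...`."""

    def set_limits():
        # Runs in the forked child, before exec (POSIX only).
        resource.setrlimit(resource.RLIMIT_AS, (address_space, address_space))
        resource.setrlimit(resource.RLIMIT_CPU, (cpu_seconds, cpu_seconds))

    return subprocess.run(
        cmd,
        preexec_fn=set_limits,
        capture_output=True,
        text=True,
    )
```

For instance, `run_with_limits(["qemu-img", "info", disk_path, "--force-share", "--output=json"])` would mirror the audited invocations in the log, with the child killed by the kernel if it exceeds either limit.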
Apr 20 10:39:24 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=8806MB free_disk=26.460193634033203GB free_vcpus=9 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 
20 10:39:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:39:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:39:24 user nova-compute[71283]: DEBUG nova.compute.manager [None req-ec4093b3-6c64-4a2c-a6f5-845c2f37af88 None None] [instance: baecb573-a3f1-42db-bc1b-788220753a7d] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:39:24 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance a8d3f0e1-b178-4197-b40a-8ebb2548432d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:39:24 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 4277833d-7119-4772-ba6d-dfa7368a652b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:39:24 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance e1e6c1fe-e5bb-496f-90e8-c77962132b79 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:39:24 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 3 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:39:24 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=896MB phys_disk=40GB used_disk=3GB total_vcpus=12 used_vcpus=3 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:39:24 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:39:24 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 
'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:39:24 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:39:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.225s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:39:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:25 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:39:25 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:39:26 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:39:26 user nova-compute[71283]: DEBUG nova.compute.manager [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:39:26 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:39:26 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Cleaning up deleted instances with incomplete migration {{(pid=71283) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 20 10:39:27 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:39:27 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:39:27 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Rebuilding the list of instances to heal {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 10:39:27 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "refresh_cache-a8d3f0e1-b178-4197-b40a-8ebb2548432d" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:39:27 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a 
None None] Acquired lock "refresh_cache-a8d3f0e1-b178-4197-b40a-8ebb2548432d" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:39:27 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Forcefully refreshing network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 10:39:27 user nova-compute[71283]: DEBUG nova.objects.instance [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lazy-loading 'info_cache' on Instance uuid a8d3f0e1-b178-4197-b40a-8ebb2548432d {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:39:28 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Updating instance_info_cache with network_info: [{"id": "4117b617-d8de-4936-aedb-a7d408fe0b34", "address": "fa:16:3e:c0:6a:4f", "network": {"id": "0ebbf38f-49de-48cb-ab19-2af234cb93fc", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-649679020-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5913ed7eb944ddae4b38e1d746e7b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4117b617-d8", "ovs_interfaceid": "4117b617-d8de-4936-aedb-a7d408fe0b34", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] 
{{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:39:28 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Releasing lock "refresh_cache-a8d3f0e1-b178-4197-b40a-8ebb2548432d" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:39:28 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Updated the network info_cache for instance {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 10:39:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:39:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:39:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:39:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:39:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:39:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-78db93a4-bbc3-416f-982a-b0a8b1d2b358 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Acquiring lock "a8d3f0e1-b178-4197-b40a-8ebb2548432d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:39:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-78db93a4-bbc3-416f-982a-b0a8b1d2b358 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "a8d3f0e1-b178-4197-b40a-8ebb2548432d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:39:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-78db93a4-bbc3-416f-982a-b0a8b1d2b358 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Acquiring lock "a8d3f0e1-b178-4197-b40a-8ebb2548432d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:39:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-78db93a4-bbc3-416f-982a-b0a8b1d2b358 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "a8d3f0e1-b178-4197-b40a-8ebb2548432d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:39:35 user nova-compute[71283]: DEBUG 
oslo_concurrency.lockutils [None req-78db93a4-bbc3-416f-982a-b0a8b1d2b358 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "a8d3f0e1-b178-4197-b40a-8ebb2548432d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:39:35 user nova-compute[71283]: INFO nova.compute.manager [None req-78db93a4-bbc3-416f-982a-b0a8b1d2b358 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Terminating instance Apr 20 10:39:35 user nova-compute[71283]: DEBUG nova.compute.manager [None req-78db93a4-bbc3-416f-982a-b0a8b1d2b358 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Start destroying the instance on the hypervisor. 
{{(pid=71283) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 10:39:35 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:35 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:35 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:36 user nova-compute[71283]: DEBUG nova.compute.manager [req-5b22e728-3136-4aa2-b39b-005008ad3959 req-56570b9c-9408-49c2-af77-78f795cafed8 service nova] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Received event network-vif-unplugged-4117b617-d8de-4936-aedb-a7d408fe0b34 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:39:36 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-5b22e728-3136-4aa2-b39b-005008ad3959 req-56570b9c-9408-49c2-af77-78f795cafed8 service nova] Acquiring lock "a8d3f0e1-b178-4197-b40a-8ebb2548432d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:39:36 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-5b22e728-3136-4aa2-b39b-005008ad3959 req-56570b9c-9408-49c2-af77-78f795cafed8 service nova] Lock "a8d3f0e1-b178-4197-b40a-8ebb2548432d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:39:36 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-5b22e728-3136-4aa2-b39b-005008ad3959 req-56570b9c-9408-49c2-af77-78f795cafed8 service 
nova] Lock "a8d3f0e1-b178-4197-b40a-8ebb2548432d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:39:36 user nova-compute[71283]: DEBUG nova.compute.manager [req-5b22e728-3136-4aa2-b39b-005008ad3959 req-56570b9c-9408-49c2-af77-78f795cafed8 service nova] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] No waiting events found dispatching network-vif-unplugged-4117b617-d8de-4936-aedb-a7d408fe0b34 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:39:36 user nova-compute[71283]: DEBUG nova.compute.manager [req-5b22e728-3136-4aa2-b39b-005008ad3959 req-56570b9c-9408-49c2-af77-78f795cafed8 service nova] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Received event network-vif-unplugged-4117b617-d8de-4936-aedb-a7d408fe0b34 for instance with task_state deleting. {{(pid=71283) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 10:39:36 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Instance destroyed successfully. 
Apr 20 10:39:36 user nova-compute[71283]: DEBUG nova.objects.instance [None req-78db93a4-bbc3-416f-982a-b0a8b1d2b358 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lazy-loading 'resources' on Instance uuid a8d3f0e1-b178-4197-b40a-8ebb2548432d {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:39:36 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-78db93a4-bbc3-416f-982a-b0a8b1d2b358 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:37:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1705189616',display_name='tempest-TestMinimumBasicScenario-server-1705189616',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1705189616',id=16,image_ref='69262355-7a22-44a5-8ebf-cd108cd9fd5c',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO6n0xssgLNJ9YwnQ9V93NcaNq9fsgPjdGXY0hrReturf7mL9Oi05p0B44KoX3tqIuN1tQo+iN1yrvDbet7dn3Flu5MucjF+Z3ihqA8dypjS6Ft9XWhbqbaTjqdOQGkwAw==',key_name='tempest-TestMinimumBasicScenario-956051906',keypairs=<?>,launch_index=0,launched_at=2023-04-20T10:37:49Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3e5913ed7eb944ddae4b38e1d746e7b9',ramdisk_id='',reservation_id='r-u1dvr1tl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='69262355-7a22-44a5-8ebf-cd108cd9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-244367270',owner_user_name='tempest-TestMinimumBasicScenario-244367270-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2023-04-20T10:37:49Z,user_data=None,user_id='878178b690704d048c37c79d596e953b',uuid=a8d3f0e1-b178-4197-b40a-8ebb2548432d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4117b617-d8de-4936-aedb-a7d408fe0b34", "address": "fa:16:3e:c0:6a:4f", "network": {"id": "0ebbf38f-49de-48cb-ab19-2af234cb93fc", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-649679020-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5913ed7eb944ddae4b38e1d746e7b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4117b617-d8", "ovs_interfaceid": "4117b617-d8de-4936-aedb-a7d408fe0b34", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 10:39:36 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-78db93a4-bbc3-416f-982a-b0a8b1d2b358 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Converting VIF {"id": "4117b617-d8de-4936-aedb-a7d408fe0b34", "address": "fa:16:3e:c0:6a:4f", "network": {"id": "0ebbf38f-49de-48cb-ab19-2af234cb93fc", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-649679020-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5913ed7eb944ddae4b38e1d746e7b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4117b617-d8", "ovs_interfaceid": "4117b617-d8de-4936-aedb-a7d408fe0b34", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:39:36 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None 
req-78db93a4-bbc3-416f-982a-b0a8b1d2b358 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c0:6a:4f,bridge_name='br-int',has_traffic_filtering=True,id=4117b617-d8de-4936-aedb-a7d408fe0b34,network=Network(0ebbf38f-49de-48cb-ab19-2af234cb93fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4117b617-d8') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:39:36 user nova-compute[71283]: DEBUG os_vif [None req-78db93a4-bbc3-416f-982a-b0a8b1d2b358 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c0:6a:4f,bridge_name='br-int',has_traffic_filtering=True,id=4117b617-d8de-4936-aedb-a7d408fe0b34,network=Network(0ebbf38f-49de-48cb-ab19-2af234cb93fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4117b617-d8') {{(pid=71283) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 10:39:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4117b617-d8, bridge=br-int, if_exists=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:39:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:39:36 user nova-compute[71283]: INFO os_vif [None req-78db93a4-bbc3-416f-982a-b0a8b1d2b358 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c0:6a:4f,bridge_name='br-int',has_traffic_filtering=True,id=4117b617-d8de-4936-aedb-a7d408fe0b34,network=Network(0ebbf38f-49de-48cb-ab19-2af234cb93fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4117b617-d8') Apr 20 10:39:36 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-78db93a4-bbc3-416f-982a-b0a8b1d2b358 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Deleting instance files /opt/stack/data/nova/instances/a8d3f0e1-b178-4197-b40a-8ebb2548432d_del Apr 20 10:39:36 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-78db93a4-bbc3-416f-982a-b0a8b1d2b358 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Deletion of /opt/stack/data/nova/instances/a8d3f0e1-b178-4197-b40a-8ebb2548432d_del complete Apr 20 10:39:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:36 user nova-compute[71283]: INFO nova.compute.manager [None req-78db93a4-bbc3-416f-982a-b0a8b1d2b358 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Took 0.66 seconds to destroy the instance on the hypervisor. 
Apr 20 10:39:36 user nova-compute[71283]: DEBUG oslo.service.loopingcall [None req-78db93a4-bbc3-416f-982a-b0a8b1d2b358 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=71283) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 10:39:36 user nova-compute[71283]: DEBUG nova.compute.manager [-] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Deallocating network for instance {{(pid=71283) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 10:39:36 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] deallocate_for_instance() {{(pid=71283) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 10:39:37 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:39:37 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Took 0.59 seconds to deallocate network for instance. 
Apr 20 10:39:37 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-78db93a4-bbc3-416f-982a-b0a8b1d2b358 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:39:37 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-78db93a4-bbc3-416f-982a-b0a8b1d2b358 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:39:37 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-78db93a4-bbc3-416f-982a-b0a8b1d2b358 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:39:37 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-78db93a4-bbc3-416f-982a-b0a8b1d2b358 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:39:37 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-78db93a4-bbc3-416f-982a-b0a8b1d2b358 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.158s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:39:37 user nova-compute[71283]: INFO nova.scheduler.client.report [None req-78db93a4-bbc3-416f-982a-b0a8b1d2b358 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Deleted allocations for instance a8d3f0e1-b178-4197-b40a-8ebb2548432d Apr 20 10:39:37 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-78db93a4-bbc3-416f-982a-b0a8b1d2b358 tempest-TestMinimumBasicScenario-244367270 tempest-TestMinimumBasicScenario-244367270-project-member] Lock "a8d3f0e1-b178-4197-b40a-8ebb2548432d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.589s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:39:38 user nova-compute[71283]: DEBUG nova.compute.manager [req-20fdb219-ba26-4e81-9a94-de8192aea508 req-3d25f1e8-7790-42d9-ad58-d6e153e7987c service nova] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Received event network-vif-plugged-4117b617-d8de-4936-aedb-a7d408fe0b34 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:39:38 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-20fdb219-ba26-4e81-9a94-de8192aea508 req-3d25f1e8-7790-42d9-ad58-d6e153e7987c service nova] Acquiring lock "a8d3f0e1-b178-4197-b40a-8ebb2548432d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:39:38 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-20fdb219-ba26-4e81-9a94-de8192aea508 req-3d25f1e8-7790-42d9-ad58-d6e153e7987c service nova] Lock "a8d3f0e1-b178-4197-b40a-8ebb2548432d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:39:38 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-20fdb219-ba26-4e81-9a94-de8192aea508 req-3d25f1e8-7790-42d9-ad58-d6e153e7987c service nova] Lock "a8d3f0e1-b178-4197-b40a-8ebb2548432d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:39:38 user nova-compute[71283]: DEBUG nova.compute.manager [req-20fdb219-ba26-4e81-9a94-de8192aea508 req-3d25f1e8-7790-42d9-ad58-d6e153e7987c service nova] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] No waiting events found dispatching network-vif-plugged-4117b617-d8de-4936-aedb-a7d408fe0b34 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:39:38 user nova-compute[71283]: WARNING nova.compute.manager [req-20fdb219-ba26-4e81-9a94-de8192aea508 req-3d25f1e8-7790-42d9-ad58-d6e153e7987c service nova] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Received unexpected event network-vif-plugged-4117b617-d8de-4936-aedb-a7d408fe0b34 for instance with vm_state deleted and task_state None. 
Apr 20 10:39:38 user nova-compute[71283]: DEBUG nova.compute.manager [req-20fdb219-ba26-4e81-9a94-de8192aea508 req-3d25f1e8-7790-42d9-ad58-d6e153e7987c service nova] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Received event network-vif-deleted-4117b617-d8de-4936-aedb-a7d408fe0b34 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:39:41 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:46 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:39:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:51 user nova-compute[71283]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:39:51 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] VM Stopped (Lifecycle Event) Apr 20 10:39:51 user nova-compute[71283]: DEBUG nova.compute.manager [None req-3c08a2fb-ecdb-4e45-b81e-4f1fc7a6d215 None None] [instance: a8d3f0e1-b178-4197-b40a-8ebb2548432d] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:39:51 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:39:56 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:40:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:40:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:40:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:40:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:40:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:40:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:40:02 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:40:06 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:40:11 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:40:15 user nova-compute[71283]: DEBUG nova.compute.manager [None req-98da34b0-27c2-4f18-bf2e-e302293e3be3 tempest-ServerBootFromVolumeStableRescueTest-134694710 
tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:40:15 user nova-compute[71283]: INFO nova.compute.manager [None req-98da34b0-27c2-4f18-bf2e-e302293e3be3 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] instance snapshotting Apr 20 10:40:15 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-98da34b0-27c2-4f18-bf2e-e302293e3be3 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Beginning live snapshot process Apr 20 10:40:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-98da34b0-27c2-4f18-bf2e-e302293e3be3 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json -f qcow2 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:40:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-98da34b0-27c2-4f18-bf2e-e302293e3be3 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json -f qcow2" returned: 0 in 0.143s {{(pid=71283) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:40:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-98da34b0-27c2-4f18-bf2e-e302293e3be3 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json -f qcow2 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:40:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-98da34b0-27c2-4f18-bf2e-e302293e3be3 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json -f qcow2" returned: 0 in 0.146s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:40:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-98da34b0-27c2-4f18-bf2e-e302293e3be3 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:40:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-98da34b0-27c2-4f18-bf2e-e302293e3be3 
tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.141s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:40:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-98da34b0-27c2-4f18-bf2e-e302293e3be3 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpo09mnue6/ec3c021cdef24c57894328fde0a53629.delta 1073741824 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:40:16 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-98da34b0-27c2-4f18-bf2e-e302293e3be3 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpo09mnue6/ec3c021cdef24c57894328fde0a53629.delta 1073741824" returned: 0 in 0.048s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:40:16 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-98da34b0-27c2-4f18-bf2e-e302293e3be3 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] 
[instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Quiescing instance not available: QEMU guest agent is not enabled. Apr 20 10:40:16 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:40:16 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:40:16 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:40:16 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:40:16 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:40:16 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:40:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.guest [None req-98da34b0-27c2-4f18-bf2e-e302293e3be3 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] COPY block job progress, current cursor: 0 final cursor: 43778048 {{(pid=71283) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 20 10:40:17 user nova-compute[71283]: DEBUG nova.virt.libvirt.guest [None req-98da34b0-27c2-4f18-bf2e-e302293e3be3 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] COPY block job progress, current cursor: 43778048 final 
cursor: 43778048 {{(pid=71283) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 20 10:40:17 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-98da34b0-27c2-4f18-bf2e-e302293e3be3 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Skipping quiescing instance: QEMU guest agent is not enabled. Apr 20 10:40:17 user nova-compute[71283]: DEBUG nova.privsep.utils [None req-98da34b0-27c2-4f18-bf2e-e302293e3be3 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71283) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 20 10:40:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-98da34b0-27c2-4f18-bf2e-e302293e3be3 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpo09mnue6/ec3c021cdef24c57894328fde0a53629.delta /opt/stack/data/nova/instances/snapshots/tmpo09mnue6/ec3c021cdef24c57894328fde0a53629 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:40:17 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-98da34b0-27c2-4f18-bf2e-e302293e3be3 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpo09mnue6/ec3c021cdef24c57894328fde0a53629.delta /opt/stack/data/nova/instances/snapshots/tmpo09mnue6/ec3c021cdef24c57894328fde0a53629" returned: 0 in 0.203s {{(pid=71283) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:40:17 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-98da34b0-27c2-4f18-bf2e-e302293e3be3 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Snapshot extracted, beginning image upload Apr 20 10:40:19 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:40:19 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:40:19 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-98da34b0-27c2-4f18-bf2e-e302293e3be3 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Snapshot image upload complete Apr 20 10:40:19 user nova-compute[71283]: INFO nova.compute.manager [None req-98da34b0-27c2-4f18-bf2e-e302293e3be3 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Took 4.31 seconds to snapshot the instance on the hypervisor. 
Apr 20 10:40:21 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:40:22 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:40:22 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:40:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:40:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:40:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.002s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:40:22 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally 
available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:40:22 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:40:22 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:40:22 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:40:23 user nova-compute[71283]: DEBUG nova.compute.manager [req-31557b91-804b-4280-92c7-e29e8b359df2 req-e256066a-b09e-4684-8c6a-75e0061c1718 service nova] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Received event network-changed-3a540308-3038-45f9-89b1-b7039742888b {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:40:23 user nova-compute[71283]: DEBUG 
nova.compute.manager [req-31557b91-804b-4280-92c7-e29e8b359df2 req-e256066a-b09e-4684-8c6a-75e0061c1718 service nova] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Refreshing instance network info cache due to event network-changed-3a540308-3038-45f9-89b1-b7039742888b. {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:40:23 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-31557b91-804b-4280-92c7-e29e8b359df2 req-e256066a-b09e-4684-8c6a-75e0061c1718 service nova] Acquiring lock "refresh_cache-e1e6c1fe-e5bb-496f-90e8-c77962132b79" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:40:23 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-31557b91-804b-4280-92c7-e29e8b359df2 req-e256066a-b09e-4684-8c6a-75e0061c1718 service nova] Acquired lock "refresh_cache-e1e6c1fe-e5bb-496f-90e8-c77962132b79" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:40:23 user nova-compute[71283]: DEBUG nova.network.neutron [req-31557b91-804b-4280-92c7-e29e8b359df2 req-e256066a-b09e-4684-8c6a-75e0061c1718 service nova] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Refreshing network info cache for port 3a540308-3038-45f9-89b1-b7039742888b {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:40:23 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:40:23 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1e6c1fe-e5bb-496f-90e8-c77962132b79/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:40:23 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1e6c1fe-e5bb-496f-90e8-c77962132b79/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:40:23 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1e6c1fe-e5bb-496f-90e8-c77962132b79/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:40:23 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1e6c1fe-e5bb-496f-90e8-c77962132b79/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:40:23 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per 
NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:40:23 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:40:23 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=8964MB free_disk=26.433185577392578GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": 
"0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": 
"0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 10:40:23 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:40:23 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:40:23 user nova-compute[71283]: DEBUG nova.network.neutron [req-31557b91-804b-4280-92c7-e29e8b359df2 req-e256066a-b09e-4684-8c6a-75e0061c1718 service nova] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Updated VIF entry in instance network info cache for port 3a540308-3038-45f9-89b1-b7039742888b. 
{{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:40:23 user nova-compute[71283]: DEBUG nova.network.neutron [req-31557b91-804b-4280-92c7-e29e8b359df2 req-e256066a-b09e-4684-8c6a-75e0061c1718 service nova] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Updating instance_info_cache with network_info: [{"id": "3a540308-3038-45f9-89b1-b7039742888b", "address": "fa:16:3e:82:13:6a", "network": {"id": "e38408cb-95b3-47d6-8d3c-32650adda382", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1326937209-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb2d81481a7d4ec1b28e5377055b3ed7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a540308-30", "ovs_interfaceid": "3a540308-3038-45f9-89b1-b7039742888b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:40:24 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 4277833d-7119-4772-ba6d-dfa7368a652b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:40:24 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance e1e6c1fe-e5bb-496f-90e8-c77962132b79 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:40:24 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:40:24 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:40:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-31557b91-804b-4280-92c7-e29e8b359df2 req-e256066a-b09e-4684-8c6a-75e0061c1718 service nova] Releasing lock "refresh_cache-e1e6c1fe-e5bb-496f-90e8-c77962132b79" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:40:24 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Refreshing inventories for resource provider bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 20 10:40:24 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Updating ProviderTree inventory for 
provider bdbc83bd-9307-4e20-8e3d-430b77499399 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 20 10:40:24 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Updating inventory in ProviderTree for provider bdbc83bd-9307-4e20-8e3d-430b77499399 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 20 10:40:24 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Refreshing aggregate associations for resource provider bdbc83bd-9307-4e20-8e3d-430b77499399, aggregates: None {{(pid=71283) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 20 10:40:24 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Refreshing trait associations for resource provider bdbc83bd-9307-4e20-8e3d-430b77499399, traits: 
COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_EXTEND,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE {{(pid=71283) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 20 10:40:24 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:40:24 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 
'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:40:24 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:40:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.534s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:40:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:40:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-355e8166-fc8a-4e20-bb5d-fb16b0278a13 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Acquiring lock "e1e6c1fe-e5bb-496f-90e8-c77962132b79" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:40:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-355e8166-fc8a-4e20-bb5d-fb16b0278a13 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "e1e6c1fe-e5bb-496f-90e8-c77962132b79" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:40:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-355e8166-fc8a-4e20-bb5d-fb16b0278a13 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Acquiring lock "e1e6c1fe-e5bb-496f-90e8-c77962132b79-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:40:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-355e8166-fc8a-4e20-bb5d-fb16b0278a13 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "e1e6c1fe-e5bb-496f-90e8-c77962132b79-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:40:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-355e8166-fc8a-4e20-bb5d-fb16b0278a13 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "e1e6c1fe-e5bb-496f-90e8-c77962132b79-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:40:24 user nova-compute[71283]: INFO nova.compute.manager [None req-355e8166-fc8a-4e20-bb5d-fb16b0278a13 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Terminating instance Apr 20 10:40:24 user nova-compute[71283]: DEBUG nova.compute.manager [None req-355e8166-fc8a-4e20-bb5d-fb16b0278a13 tempest-AttachVolumeShelveTestJSON-1878611374 
tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Start destroying the instance on the hypervisor. {{(pid=71283) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 10:40:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:40:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:40:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:40:25 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:40:25 user nova-compute[71283]: DEBUG nova.compute.manager [req-a8420921-f5ac-4cdd-977f-dcf55ddc30da req-e43e9d94-f60a-42c5-af1f-f7c19102daef service nova] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Received event network-vif-unplugged-3a540308-3038-45f9-89b1-b7039742888b {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:40:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-a8420921-f5ac-4cdd-977f-dcf55ddc30da req-e43e9d94-f60a-42c5-af1f-f7c19102daef service nova] Acquiring lock "e1e6c1fe-e5bb-496f-90e8-c77962132b79-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:40:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-a8420921-f5ac-4cdd-977f-dcf55ddc30da req-e43e9d94-f60a-42c5-af1f-f7c19102daef service nova] Lock "e1e6c1fe-e5bb-496f-90e8-c77962132b79-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:40:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-a8420921-f5ac-4cdd-977f-dcf55ddc30da req-e43e9d94-f60a-42c5-af1f-f7c19102daef service nova] Lock "e1e6c1fe-e5bb-496f-90e8-c77962132b79-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:40:25 user nova-compute[71283]: DEBUG nova.compute.manager [req-a8420921-f5ac-4cdd-977f-dcf55ddc30da req-e43e9d94-f60a-42c5-af1f-f7c19102daef service nova] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] No waiting events found dispatching network-vif-unplugged-3a540308-3038-45f9-89b1-b7039742888b {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:40:25 user nova-compute[71283]: DEBUG nova.compute.manager [req-a8420921-f5ac-4cdd-977f-dcf55ddc30da req-e43e9d94-f60a-42c5-af1f-f7c19102daef service nova] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Received event network-vif-unplugged-3a540308-3038-45f9-89b1-b7039742888b for instance with task_state deleting. {{(pid=71283) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 10:40:25 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Instance destroyed successfully. 
Apr 20 10:40:25 user nova-compute[71283]: DEBUG nova.objects.instance [None req-355e8166-fc8a-4e20-bb5d-fb16b0278a13 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lazy-loading 'resources' on Instance uuid e1e6c1fe-e5bb-496f-90e8-c77962132b79 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:40:25 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-355e8166-fc8a-4e20-bb5d-fb16b0278a13 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:38:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-320923773',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-320923773',id=18,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLhpymhd3yO1Et+0eGAwsr/GXx7RZVrMrGei6a2cvPWCyKtNLDS8f5siOp4rn0Q/jHWAzkxE2ACk5TYmPElXFB6TEwAmLrBWzrznzaospNaBLAUvu71t3e1gkhL61SXmbQ==',key_name='tempest-keypair-890693079',keypairs=,launch_index=0,launched_at=2023-04-20T10:38:38Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='cb2d81481a7d4ec1b28e5377055b3ed7',ramdisk_id='',reservation_id='r-peph37r0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeShelveTestJSON-1878611374',owner_user_name='tempest-AttachVolumeShelveTestJSON-1878611374-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T10:38:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0a931240713e465197b96fa84574ba23',uuid=e1e6c1fe-e5bb-496f-90e8-c77962132b79,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3a540308-3038-45f9-89b1-b7039742888b", "address": "fa:16:3e:82:13:6a", "network": {"id": "e38408cb-95b3-47d6-8d3c-32650adda382", "bridge": "br-int", "label": 
"tempest-AttachVolumeShelveTestJSON-1326937209-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb2d81481a7d4ec1b28e5377055b3ed7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a540308-30", "ovs_interfaceid": "3a540308-3038-45f9-89b1-b7039742888b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 10:40:25 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-355e8166-fc8a-4e20-bb5d-fb16b0278a13 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Converting VIF {"id": "3a540308-3038-45f9-89b1-b7039742888b", "address": "fa:16:3e:82:13:6a", "network": {"id": "e38408cb-95b3-47d6-8d3c-32650adda382", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1326937209-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb2d81481a7d4ec1b28e5377055b3ed7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, 
"connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a540308-30", "ovs_interfaceid": "3a540308-3038-45f9-89b1-b7039742888b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:40:25 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-355e8166-fc8a-4e20-bb5d-fb16b0278a13 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:82:13:6a,bridge_name='br-int',has_traffic_filtering=True,id=3a540308-3038-45f9-89b1-b7039742888b,network=Network(e38408cb-95b3-47d6-8d3c-32650adda382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a540308-30') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:40:25 user nova-compute[71283]: DEBUG os_vif [None req-355e8166-fc8a-4e20-bb5d-fb16b0278a13 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:13:6a,bridge_name='br-int',has_traffic_filtering=True,id=3a540308-3038-45f9-89b1-b7039742888b,network=Network(e38408cb-95b3-47d6-8d3c-32650adda382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a540308-30') {{(pid=71283) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 10:40:25 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:40:25 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a540308-30, bridge=br-int, 
if_exists=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:40:25 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:40:25 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:40:25 user nova-compute[71283]: INFO os_vif [None req-355e8166-fc8a-4e20-bb5d-fb16b0278a13 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:13:6a,bridge_name='br-int',has_traffic_filtering=True,id=3a540308-3038-45f9-89b1-b7039742888b,network=Network(e38408cb-95b3-47d6-8d3c-32650adda382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a540308-30') Apr 20 10:40:25 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-355e8166-fc8a-4e20-bb5d-fb16b0278a13 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Deleting instance files /opt/stack/data/nova/instances/e1e6c1fe-e5bb-496f-90e8-c77962132b79_del Apr 20 10:40:25 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-355e8166-fc8a-4e20-bb5d-fb16b0278a13 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Deletion of /opt/stack/data/nova/instances/e1e6c1fe-e5bb-496f-90e8-c77962132b79_del complete Apr 20 10:40:25 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:40:25 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:40:25 user nova-compute[71283]: INFO nova.compute.manager [None req-355e8166-fc8a-4e20-bb5d-fb16b0278a13 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Took 0.70 seconds to destroy the instance on the hypervisor. Apr 20 10:40:25 user nova-compute[71283]: DEBUG oslo.service.loopingcall [None req-355e8166-fc8a-4e20-bb5d-fb16b0278a13 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=71283) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 10:40:25 user nova-compute[71283]: DEBUG nova.compute.manager [-] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Deallocating network for instance {{(pid=71283) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 10:40:25 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] deallocate_for_instance() {{(pid=71283) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 10:40:26 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:40:26 user nova-compute[71283]: DEBUG nova.compute.manager [req-8fcae586-9025-41bb-8564-7f4fb8d2cab9 req-4736675e-9eb4-4389-94d0-25fe2ed2b49b service nova] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Received event network-vif-deleted-3a540308-3038-45f9-89b1-b7039742888b {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:40:26 user nova-compute[71283]: INFO nova.compute.manager [req-8fcae586-9025-41bb-8564-7f4fb8d2cab9 req-4736675e-9eb4-4389-94d0-25fe2ed2b49b service nova] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Neutron deleted interface 3a540308-3038-45f9-89b1-b7039742888b; detaching it from the instance and deleting it from the info cache Apr 20 10:40:26 user nova-compute[71283]: DEBUG nova.network.neutron [req-8fcae586-9025-41bb-8564-7f4fb8d2cab9 req-4736675e-9eb4-4389-94d0-25fe2ed2b49b service nova] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:40:26 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: 
e1e6c1fe-e5bb-496f-90e8-c77962132b79] Took 0.80 seconds to deallocate network for instance. Apr 20 10:40:26 user nova-compute[71283]: DEBUG nova.compute.manager [req-8fcae586-9025-41bb-8564-7f4fb8d2cab9 req-4736675e-9eb4-4389-94d0-25fe2ed2b49b service nova] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Detach interface failed, port_id=3a540308-3038-45f9-89b1-b7039742888b, reason: Instance e1e6c1fe-e5bb-496f-90e8-c77962132b79 could not be found. {{(pid=71283) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 20 10:40:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-355e8166-fc8a-4e20-bb5d-fb16b0278a13 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:40:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-355e8166-fc8a-4e20-bb5d-fb16b0278a13 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:40:26 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-355e8166-fc8a-4e20-bb5d-fb16b0278a13 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:40:26 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-355e8166-fc8a-4e20-bb5d-fb16b0278a13 tempest-AttachVolumeShelveTestJSON-1878611374 
tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:40:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-355e8166-fc8a-4e20-bb5d-fb16b0278a13 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.136s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:40:26 user nova-compute[71283]: INFO nova.scheduler.client.report [None req-355e8166-fc8a-4e20-bb5d-fb16b0278a13 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Deleted allocations for instance e1e6c1fe-e5bb-496f-90e8-c77962132b79 Apr 20 10:40:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-355e8166-fc8a-4e20-bb5d-fb16b0278a13 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "e1e6c1fe-e5bb-496f-90e8-c77962132b79" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.829s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:40:26 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task 
ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:40:26 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:40:27 user nova-compute[71283]: DEBUG nova.compute.manager [req-41c1b99e-4bea-4ba6-94a1-08ab099578b8 req-9ce1abff-ccab-4fb0-804d-d1c822be62a9 service nova] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Received event network-vif-plugged-3a540308-3038-45f9-89b1-b7039742888b {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:40:27 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-41c1b99e-4bea-4ba6-94a1-08ab099578b8 req-9ce1abff-ccab-4fb0-804d-d1c822be62a9 service nova] Acquiring lock "e1e6c1fe-e5bb-496f-90e8-c77962132b79-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:40:27 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-41c1b99e-4bea-4ba6-94a1-08ab099578b8 req-9ce1abff-ccab-4fb0-804d-d1c822be62a9 service nova] Lock "e1e6c1fe-e5bb-496f-90e8-c77962132b79-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:40:27 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-41c1b99e-4bea-4ba6-94a1-08ab099578b8 req-9ce1abff-ccab-4fb0-804d-d1c822be62a9 service nova] Lock "e1e6c1fe-e5bb-496f-90e8-c77962132b79-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:40:27 user nova-compute[71283]: DEBUG nova.compute.manager [req-41c1b99e-4bea-4ba6-94a1-08ab099578b8 req-9ce1abff-ccab-4fb0-804d-d1c822be62a9 service nova] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] No waiting events found dispatching network-vif-plugged-3a540308-3038-45f9-89b1-b7039742888b {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:40:27 user nova-compute[71283]: WARNING nova.compute.manager [req-41c1b99e-4bea-4ba6-94a1-08ab099578b8 req-9ce1abff-ccab-4fb0-804d-d1c822be62a9 service nova] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Received unexpected event network-vif-plugged-3a540308-3038-45f9-89b1-b7039742888b for instance with vm_state deleted and task_state None. Apr 20 10:40:27 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:40:28 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:40:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:40:29 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:40:29 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache 
/opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:40:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "refresh_cache-4277833d-7119-4772-ba6d-dfa7368a652b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:40:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquired lock "refresh_cache-4277833d-7119-4772-ba6d-dfa7368a652b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:40:29 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Forcefully refreshing network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 10:40:30 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Updating instance_info_cache with network_info: [{"id": "02bf30f9-074c-4a50-a4c8-d468c31067a8", "address": "fa:16:3e:ee:07:da", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap02bf30f9-07", "ovs_interfaceid": 
"02bf30f9-074c-4a50-a4c8-d468c31067a8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:40:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Releasing lock "refresh_cache-4277833d-7119-4772-ba6d-dfa7368a652b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:40:30 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Updated the network info_cache for instance {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 10:40:30 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:40:31 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:40:33 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:40:35 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:40:40 user nova-compute[71283]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:40:40 user nova-compute[71283]: INFO nova.compute.manager 
[-] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] VM Stopped (Lifecycle Event) Apr 20 10:40:40 user nova-compute[71283]: DEBUG nova.compute.manager [None req-26bf3206-10fe-4fae-8ad8-f47a7e52d979 None None] [instance: e1e6c1fe-e5bb-496f-90e8-c77962132b79] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:40:40 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:40:45 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:40:50 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:40:55 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:40:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:41:00 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:41:05 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:41:10 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:41:15 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:41:15 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:41:15 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:41:15 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:41:15 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:41:15 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:41:15 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:41:18 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:41:19 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Acquiring lock "353f8fcd-6f6e-43e8-b46c-0258d7a6a17b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:41:19 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "353f8fcd-6f6e-43e8-b46c-0258d7a6a17b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:41:19 user nova-compute[71283]: DEBUG nova.compute.manager [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Starting instance... {{(pid=71283) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 10:41:19 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:41:19 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:41:19 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 
tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71283) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 10:41:19 user nova-compute[71283]: INFO nova.compute.claims [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Claim successful on node user Apr 20 10:41:19 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:41:19 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:41:19 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.249s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:41:19 user nova-compute[71283]: DEBUG nova.compute.manager [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Start building networks asynchronously for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 10:41:19 user nova-compute[71283]: DEBUG nova.compute.manager [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Allocating IP information in the background. {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 10:41:19 user nova-compute[71283]: DEBUG nova.network.neutron [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] allocate_for_instance() {{(pid=71283) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 10:41:19 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 20 10:41:19 user nova-compute[71283]: DEBUG nova.compute.manager [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Start building block device mappings for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 10:41:19 user nova-compute[71283]: DEBUG nova.policy [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0a931240713e465197b96fa84574ba23', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cb2d81481a7d4ec1b28e5377055b3ed7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71283) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 10:41:19 user nova-compute[71283]: DEBUG nova.compute.manager [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Start spawning the instance on the hypervisor. 
{{(pid=71283) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 10:41:19 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Creating instance directory {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 10:41:19 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Creating image(s) Apr 20 10:41:19 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Acquiring lock "/opt/stack/data/nova/instances/353f8fcd-6f6e-43e8-b46c-0258d7a6a17b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:41:19 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "/opt/stack/data/nova/instances/353f8fcd-6f6e-43e8-b46c-0258d7a6a17b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:41:19 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 
tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "/opt/stack/data/nova/instances/353f8fcd-6f6e-43e8-b46c-0258d7a6a17b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:41:19 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:41:19 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.133s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:41:19 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Acquiring lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:41:19 user 
nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:41:19 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:41:20 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.137s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:41:20 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o 
backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/353f8fcd-6f6e-43e8-b46c-0258d7a6a17b/disk 1073741824 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:41:20 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/353f8fcd-6f6e-43e8-b46c-0258d7a6a17b/disk 1073741824" returned: 0 in 0.044s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:41:20 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.187s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:41:20 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:41:20 user nova-compute[71283]: DEBUG 
oslo_concurrency.processutils [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.126s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:41:20 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Checking if we can resize image /opt/stack/data/nova/instances/353f8fcd-6f6e-43e8-b46c-0258d7a6a17b/disk. size=1073741824 {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 10:41:20 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/353f8fcd-6f6e-43e8-b46c-0258d7a6a17b/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:41:20 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:41:20 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit 
--as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/353f8fcd-6f6e-43e8-b46c-0258d7a6a17b/disk --force-share --output=json" returned: 0 in 0.159s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:41:20 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Cannot resize image /opt/stack/data/nova/instances/353f8fcd-6f6e-43e8-b46c-0258d7a6a17b/disk to a smaller size. {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 10:41:20 user nova-compute[71283]: DEBUG nova.objects.instance [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lazy-loading 'migration_context' on Instance uuid 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:41:20 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Created local disks {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 10:41:20 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Ensure instance console log exists: /opt/stack/data/nova/instances/353f8fcd-6f6e-43e8-b46c-0258d7a6a17b/console.log {{(pid=71283) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 10:41:20 user nova-compute[71283]: DEBUG 
oslo_concurrency.lockutils [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:41:20 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:41:20 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:41:20 user nova-compute[71283]: DEBUG nova.network.neutron [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Successfully created port: 1cf8ddec-3f1f-4db3-9dfc-fb761edd6929 {{(pid=71283) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 10:41:20 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:41:21 user 
nova-compute[71283]: DEBUG nova.network.neutron [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Successfully updated port: 1cf8ddec-3f1f-4db3-9dfc-fb761edd6929 {{(pid=71283) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Acquiring lock "refresh_cache-353f8fcd-6f6e-43e8-b46c-0258d7a6a17b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Acquired lock "refresh_cache-353f8fcd-6f6e-43e8-b46c-0258d7a6a17b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG nova.network.neutron [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Building network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG nova.compute.manager [req-f3d20162-b675-4ac6-b077-4a7e0ecb54b9 req-23817756-5328-40b9-9582-9163307568b8 service nova] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Received event network-changed-1cf8ddec-3f1f-4db3-9dfc-fb761edd6929 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG 
nova.compute.manager [req-f3d20162-b675-4ac6-b077-4a7e0ecb54b9 req-23817756-5328-40b9-9582-9163307568b8 service nova] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Refreshing instance network info cache due to event network-changed-1cf8ddec-3f1f-4db3-9dfc-fb761edd6929. {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-f3d20162-b675-4ac6-b077-4a7e0ecb54b9 req-23817756-5328-40b9-9582-9163307568b8 service nova] Acquiring lock "refresh_cache-353f8fcd-6f6e-43e8-b46c-0258d7a6a17b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG nova.network.neutron [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Instance cache missing network info. 
{{(pid=71283) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG nova.network.neutron [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Updating instance_info_cache with network_info: [{"id": "1cf8ddec-3f1f-4db3-9dfc-fb761edd6929", "address": "fa:16:3e:2b:49:99", "network": {"id": "e38408cb-95b3-47d6-8d3c-32650adda382", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1326937209-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb2d81481a7d4ec1b28e5377055b3ed7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cf8ddec-3f", "ovs_interfaceid": "1cf8ddec-3f1f-4db3-9dfc-fb761edd6929", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Releasing lock "refresh_cache-353f8fcd-6f6e-43e8-b46c-0258d7a6a17b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG nova.compute.manager [None 
req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Instance network_info: |[{"id": "1cf8ddec-3f1f-4db3-9dfc-fb761edd6929", "address": "fa:16:3e:2b:49:99", "network": {"id": "e38408cb-95b3-47d6-8d3c-32650adda382", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1326937209-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb2d81481a7d4ec1b28e5377055b3ed7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cf8ddec-3f", "ovs_interfaceid": "1cf8ddec-3f1f-4db3-9dfc-fb761edd6929", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-f3d20162-b675-4ac6-b077-4a7e0ecb54b9 req-23817756-5328-40b9-9582-9163307568b8 service nova] Acquired lock "refresh_cache-353f8fcd-6f6e-43e8-b46c-0258d7a6a17b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG nova.network.neutron [req-f3d20162-b675-4ac6-b077-4a7e0ecb54b9 req-23817756-5328-40b9-9582-9163307568b8 service nova] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Refreshing network info cache for port 1cf8ddec-3f1f-4db3-9dfc-fb761edd6929 {{(pid=71283) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Start _get_guest_xml network_info=[{"id": "1cf8ddec-3f1f-4db3-9dfc-fb761edd6929", "address": "fa:16:3e:2b:49:99", "network": {"id": "e38408cb-95b3-47d6-8d3c-32650adda382", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1326937209-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb2d81481a7d4ec1b28e5377055b3ed7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cf8ddec-3f", "ovs_interfaceid": "1cf8ddec-3f1f-4db3-9dfc-fb761edd6929", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=) 
rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '3687b278-c2bc-46f2-9b7e-579c8d06fe41'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 10:41:21 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:41:21 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:41:21 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71283) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T10:25:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=), allow threads: True {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Flavor limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 
tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Image limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Flavor pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Image pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71283) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Got 1 possible topologies {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:41:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1678979542',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1678979542',id=19,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNGhoArKgEyYE7cjbkuljf5dfqWvULtvRqXeDkd3yU0VFytwA0oQTi1WmPBfETn5tfNR5w6ja8z1Fe252sgoGIglYuGRFBV1xpFytKMJ6xdB6Euq8vN0T70qYh0FRstlFQ==',key_name='tempest-keypair-1124108181',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cb2d81481a7d4ec1b28e5377055b3ed7',ramdisk_id='',reservation_id='r-t0ge5vja',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1878611374',owner_user_name='tempest-AttachVolume
ShelveTestJSON-1878611374-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:41:20Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0a931240713e465197b96fa84574ba23',uuid=353f8fcd-6f6e-43e8-b46c-0258d7a6a17b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1cf8ddec-3f1f-4db3-9dfc-fb761edd6929", "address": "fa:16:3e:2b:49:99", "network": {"id": "e38408cb-95b3-47d6-8d3c-32650adda382", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1326937209-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb2d81481a7d4ec1b28e5377055b3ed7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cf8ddec-3f", "ovs_interfaceid": "1cf8ddec-3f1f-4db3-9dfc-fb761edd6929", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71283) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Converting VIF {"id": "1cf8ddec-3f1f-4db3-9dfc-fb761edd6929", "address": "fa:16:3e:2b:49:99", "network": {"id": "e38408cb-95b3-47d6-8d3c-32650adda382", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1326937209-network", 
"subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb2d81481a7d4ec1b28e5377055b3ed7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cf8ddec-3f", "ovs_interfaceid": "1cf8ddec-3f1f-4db3-9dfc-fb761edd6929", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:49:99,bridge_name='br-int',has_traffic_filtering=True,id=1cf8ddec-3f1f-4db3-9dfc-fb761edd6929,network=Network(e38408cb-95b3-47d6-8d3c-32650adda382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cf8ddec-3f') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG nova.objects.instance [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lazy-loading 'pci_devices' on Instance uuid 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 
tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] End _get_guest_xml xml= Apr 20 10:41:21 user nova-compute[71283]: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b Apr 20 10:41:21 user nova-compute[71283]: instance-00000013 Apr 20 10:41:21 user nova-compute[71283]: 131072 Apr 20 10:41:21 user nova-compute[71283]: 1 Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: tempest-AttachVolumeShelveTestJSON-server-1678979542 Apr 20 10:41:21 user nova-compute[71283]: 2023-04-20 10:41:21 Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: 128 Apr 20 10:41:21 user nova-compute[71283]: 1 Apr 20 10:41:21 user nova-compute[71283]: 0 Apr 20 10:41:21 user nova-compute[71283]: 0 Apr 20 10:41:21 user nova-compute[71283]: 1 Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: tempest-AttachVolumeShelveTestJSON-1878611374-project-member Apr 20 10:41:21 user nova-compute[71283]: tempest-AttachVolumeShelveTestJSON-1878611374 Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: OpenStack Foundation Apr 20 10:41:21 user nova-compute[71283]: OpenStack Nova Apr 20 10:41:21 user nova-compute[71283]: 0.0.0 Apr 20 10:41:21 user nova-compute[71283]: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b Apr 20 10:41:21 user nova-compute[71283]: 
353f8fcd-6f6e-43e8-b46c-0258d7a6a17b Apr 20 10:41:21 user nova-compute[71283]: Virtual Machine Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: hvm Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Nehalem Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: /dev/urandom Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user 
nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: Apr 20 10:41:21 user nova-compute[71283]: {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:41:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1678979542',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1678979542',id=19,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNGhoArKgEyYE7cjbkuljf5dfqWvULtvRqXeDkd3yU0VFytwA0oQTi1WmPBfETn5tfNR5w6ja8z1Fe252sgoGIglYuGRFBV1xpFytKMJ6xdB6Euq8vN0T70qYh0FRstlFQ==',key_name='tempest-keypair-1124108181',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cb2d81481a7d4ec1b28e5377055b3ed7',ramdisk_id='',reservation_id='r-t0ge5vja',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1878611374',owner_user_name='tempest-AttachVolumeShelveTestJSON-1878611374-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:41:20Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0a931240713e465197b96fa84574ba23',uuid=353f8fcd-6f6e-43e8-b46c-0258d7a6a17b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1cf8ddec-3f1f-4db3-9dfc-fb761edd6929", "address": "fa:16:3e:2b:49:99", "network": {"id": "e38408cb-95b3-47d6-8d3c-32650adda382", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1326937209-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], 
"gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb2d81481a7d4ec1b28e5377055b3ed7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cf8ddec-3f", "ovs_interfaceid": "1cf8ddec-3f1f-4db3-9dfc-fb761edd6929", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Converting VIF {"id": "1cf8ddec-3f1f-4db3-9dfc-fb761edd6929", "address": "fa:16:3e:2b:49:99", "network": {"id": "e38408cb-95b3-47d6-8d3c-32650adda382", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1326937209-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb2d81481a7d4ec1b28e5377055b3ed7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cf8ddec-3f", "ovs_interfaceid": "1cf8ddec-3f1f-4db3-9dfc-fb761edd6929", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:49:99,bridge_name='br-int',has_traffic_filtering=True,id=1cf8ddec-3f1f-4db3-9dfc-fb761edd6929,network=Network(e38408cb-95b3-47d6-8d3c-32650adda382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cf8ddec-3f') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG os_vif [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:49:99,bridge_name='br-int',has_traffic_filtering=True,id=1cf8ddec-3f1f-4db3-9dfc-fb761edd6929,network=Network(e38408cb-95b3-47d6-8d3c-32650adda382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cf8ddec-3f') {{(pid=71283) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change 
{{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1cf8ddec-3f, may_exist=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1cf8ddec-3f, col_values=(('external_ids', {'iface-id': '1cf8ddec-3f1f-4db3-9dfc-fb761edd6929', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2b:49:99', 'vm-uuid': '353f8fcd-6f6e-43e8-b46c-0258d7a6a17b'}),)) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:41:21 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:41:21 user nova-compute[71283]: INFO os_vif [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Successfully plugged vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:2b:49:99,bridge_name='br-int',has_traffic_filtering=True,id=1cf8ddec-3f1f-4db3-9dfc-fb761edd6929,network=Network(e38408cb-95b3-47d6-8d3c-32650adda382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cf8ddec-3f') Apr 20 10:41:22 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] No BDM found with device name vda, not building metadata. {{(pid=71283) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 10:41:22 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] No VIF found with MAC fa:16:3e:2b:49:99, not building metadata {{(pid=71283) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 10:41:22 user nova-compute[71283]: DEBUG nova.network.neutron [req-f3d20162-b675-4ac6-b077-4a7e0ecb54b9 req-23817756-5328-40b9-9582-9163307568b8 service nova] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Updated VIF entry in instance network info cache for port 1cf8ddec-3f1f-4db3-9dfc-fb761edd6929. 
{{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:41:22 user nova-compute[71283]: DEBUG nova.network.neutron [req-f3d20162-b675-4ac6-b077-4a7e0ecb54b9 req-23817756-5328-40b9-9582-9163307568b8 service nova] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Updating instance_info_cache with network_info: [{"id": "1cf8ddec-3f1f-4db3-9dfc-fb761edd6929", "address": "fa:16:3e:2b:49:99", "network": {"id": "e38408cb-95b3-47d6-8d3c-32650adda382", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1326937209-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb2d81481a7d4ec1b28e5377055b3ed7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cf8ddec-3f", "ovs_interfaceid": "1cf8ddec-3f1f-4db3-9dfc-fb761edd6929", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:41:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-f3d20162-b675-4ac6-b077-4a7e0ecb54b9 req-23817756-5328-40b9-9582-9163307568b8 service nova] Releasing lock "refresh_cache-353f8fcd-6f6e-43e8-b46c-0258d7a6a17b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:41:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 
tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquiring lock "3d5b6824-153d-43ab-b274-38d733da2664" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:41:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "3d5b6824-153d-43ab-b274-38d733da2664" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:41:22 user nova-compute[71283]: DEBUG nova.compute.manager [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Starting instance... 
{{(pid=71283) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 10:41:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:41:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:41:22 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71283) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 10:41:22 user nova-compute[71283]: INFO nova.compute.claims [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Claim successful on node user Apr 20 10:41:22 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:41:22 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:41:22 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.247s {{(pid=71283) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:41:22 user nova-compute[71283]: DEBUG nova.compute.manager [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Start building networks asynchronously for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 10:41:22 user nova-compute[71283]: DEBUG nova.compute.manager [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Allocating IP information in the background. {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 10:41:22 user nova-compute[71283]: DEBUG nova.network.neutron [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] allocate_for_instance() {{(pid=71283) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 10:41:22 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 20 10:41:22 user nova-compute[71283]: DEBUG nova.compute.manager [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Start building block device mappings for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 10:41:22 user nova-compute[71283]: INFO nova.virt.block_device [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Booting with blank volume at /dev/vda Apr 20 10:41:22 user nova-compute[71283]: DEBUG nova.policy [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1b7d729afb5942b8a1753ffaa4d5a268', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5bcb8e3930844068bd2b496914a4d764', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71283) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 10:41:23 user nova-compute[71283]: WARNING nova.compute.manager [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Volume id: a7649a67-45bb-48f9-a19c-57fc0cd1a24b finished being created but its status is error. 
Apr 20 10:41:23 user nova-compute[71283]: ERROR nova.compute.manager [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Instance failed block device setup: nova.exception.VolumeNotCreated: Volume a7649a67-45bb-48f9-a19c-57fc0cd1a24b did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. Apr 20 10:41:23 user nova-compute[71283]: ERROR nova.compute.manager [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Traceback (most recent call last): Apr 20 10:41:23 user nova-compute[71283]: ERROR nova.compute.manager [instance: 3d5b6824-153d-43ab-b274-38d733da2664] File "/opt/stack/nova/nova/compute/manager.py", line 2175, in _prep_block_device Apr 20 10:41:23 user nova-compute[71283]: ERROR nova.compute.manager [instance: 3d5b6824-153d-43ab-b274-38d733da2664] driver_block_device.attach_block_devices( Apr 20 10:41:23 user nova-compute[71283]: ERROR nova.compute.manager [instance: 3d5b6824-153d-43ab-b274-38d733da2664] File "/opt/stack/nova/nova/virt/block_device.py", line 936, in attach_block_devices Apr 20 10:41:23 user nova-compute[71283]: ERROR nova.compute.manager [instance: 3d5b6824-153d-43ab-b274-38d733da2664] _log_and_attach(device) Apr 20 10:41:23 user nova-compute[71283]: ERROR nova.compute.manager [instance: 3d5b6824-153d-43ab-b274-38d733da2664] File "/opt/stack/nova/nova/virt/block_device.py", line 933, in _log_and_attach Apr 20 10:41:23 user nova-compute[71283]: ERROR nova.compute.manager [instance: 3d5b6824-153d-43ab-b274-38d733da2664] bdm.attach(*attach_args, **attach_kwargs) Apr 20 10:41:23 user nova-compute[71283]: ERROR nova.compute.manager [instance: 3d5b6824-153d-43ab-b274-38d733da2664] File "/opt/stack/nova/nova/virt/block_device.py", line 848, in attach Apr 20 10:41:23 user nova-compute[71283]: ERROR nova.compute.manager [instance: 
3d5b6824-153d-43ab-b274-38d733da2664] self.volume_id, self.attachment_id = self._create_volume( Apr 20 10:41:23 user nova-compute[71283]: ERROR nova.compute.manager [instance: 3d5b6824-153d-43ab-b274-38d733da2664] File "/opt/stack/nova/nova/virt/block_device.py", line 435, in _create_volume Apr 20 10:41:23 user nova-compute[71283]: ERROR nova.compute.manager [instance: 3d5b6824-153d-43ab-b274-38d733da2664] self._call_wait_func(context, wait_func, volume_api, vol['id']) Apr 20 10:41:23 user nova-compute[71283]: ERROR nova.compute.manager [instance: 3d5b6824-153d-43ab-b274-38d733da2664] File "/opt/stack/nova/nova/virt/block_device.py", line 785, in _call_wait_func Apr 20 10:41:23 user nova-compute[71283]: ERROR nova.compute.manager [instance: 3d5b6824-153d-43ab-b274-38d733da2664] with excutils.save_and_reraise_exception(): Apr 20 10:41:23 user nova-compute[71283]: ERROR nova.compute.manager [instance: 3d5b6824-153d-43ab-b274-38d733da2664] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ Apr 20 10:41:23 user nova-compute[71283]: ERROR nova.compute.manager [instance: 3d5b6824-153d-43ab-b274-38d733da2664] self.force_reraise() Apr 20 10:41:23 user nova-compute[71283]: ERROR nova.compute.manager [instance: 3d5b6824-153d-43ab-b274-38d733da2664] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise Apr 20 10:41:23 user nova-compute[71283]: ERROR nova.compute.manager [instance: 3d5b6824-153d-43ab-b274-38d733da2664] raise self.value Apr 20 10:41:23 user nova-compute[71283]: ERROR nova.compute.manager [instance: 3d5b6824-153d-43ab-b274-38d733da2664] File "/opt/stack/nova/nova/virt/block_device.py", line 783, in _call_wait_func Apr 20 10:41:23 user nova-compute[71283]: ERROR nova.compute.manager [instance: 3d5b6824-153d-43ab-b274-38d733da2664] wait_func(context, volume_id) Apr 20 10:41:23 user nova-compute[71283]: ERROR nova.compute.manager [instance: 3d5b6824-153d-43ab-b274-38d733da2664] 
File "/opt/stack/nova/nova/compute/manager.py", line 1792, in _await_block_device_map_created Apr 20 10:41:23 user nova-compute[71283]: ERROR nova.compute.manager [instance: 3d5b6824-153d-43ab-b274-38d733da2664] raise exception.VolumeNotCreated(volume_id=vol_id, Apr 20 10:41:23 user nova-compute[71283]: ERROR nova.compute.manager [instance: 3d5b6824-153d-43ab-b274-38d733da2664] nova.exception.VolumeNotCreated: Volume a7649a67-45bb-48f9-a19c-57fc0cd1a24b did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. Apr 20 10:41:23 user nova-compute[71283]: ERROR nova.compute.manager [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Apr 20 10:41:23 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:41:23 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:41:23 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:41:23 user nova-compute[71283]: DEBUG nova.network.neutron [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Successfully created port: a5802bfe-63ff-426d-b2af-149698ad1a21 {{(pid=71283) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 10:41:23 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:41:23 user 
nova-compute[71283]: DEBUG nova.compute.manager [req-8479f53c-0641-485b-a910-b1c8b8187478 req-c2633b5a-8dd8-4072-afe2-a00cf1fa720f service nova] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Received event network-vif-plugged-1cf8ddec-3f1f-4db3-9dfc-fb761edd6929 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:41:23 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-8479f53c-0641-485b-a910-b1c8b8187478 req-c2633b5a-8dd8-4072-afe2-a00cf1fa720f service nova] Acquiring lock "353f8fcd-6f6e-43e8-b46c-0258d7a6a17b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:41:23 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-8479f53c-0641-485b-a910-b1c8b8187478 req-c2633b5a-8dd8-4072-afe2-a00cf1fa720f service nova] Lock "353f8fcd-6f6e-43e8-b46c-0258d7a6a17b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:41:23 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-8479f53c-0641-485b-a910-b1c8b8187478 req-c2633b5a-8dd8-4072-afe2-a00cf1fa720f service nova] Lock "353f8fcd-6f6e-43e8-b46c-0258d7a6a17b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:41:23 user nova-compute[71283]: DEBUG nova.compute.manager [req-8479f53c-0641-485b-a910-b1c8b8187478 req-c2633b5a-8dd8-4072-afe2-a00cf1fa720f service nova] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] No waiting events found dispatching network-vif-plugged-1cf8ddec-3f1f-4db3-9dfc-fb761edd6929 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:41:23 user 
nova-compute[71283]: WARNING nova.compute.manager [req-8479f53c-0641-485b-a910-b1c8b8187478 req-c2633b5a-8dd8-4072-afe2-a00cf1fa720f service nova] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Received unexpected event network-vif-plugged-1cf8ddec-3f1f-4db3-9dfc-fb761edd6929 for instance with vm_state building and task_state spawning. Apr 20 10:41:23 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:41:23 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:41:23 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG nova.network.neutron [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Successfully updated port: a5802bfe-63ff-426d-b2af-149698ad1a21 {{(pid=71283) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquiring lock "refresh_cache-3d5b6824-153d-43ab-b274-38d733da2664" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquired 
lock "refresh_cache-3d5b6824-153d-43ab-b274-38d733da2664" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG nova.network.neutron [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Building network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG nova.compute.manager [req-abc8e3a3-71bb-49b7-ad97-7ccea3a51053 req-3b54f8ee-09b9-4d6c-bc08-2259ff5a97a1 service nova] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Received event network-changed-a5802bfe-63ff-426d-b2af-149698ad1a21 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG nova.compute.manager [req-abc8e3a3-71bb-49b7-ad97-7ccea3a51053 req-3b54f8ee-09b9-4d6c-bc08-2259ff5a97a1 service nova] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Refreshing instance network info cache due to event network-changed-a5802bfe-63ff-426d-b2af-149698ad1a21. 
{{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-abc8e3a3-71bb-49b7-ad97-7ccea3a51053 req-3b54f8ee-09b9-4d6c-bc08-2259ff5a97a1 service nova] Acquiring lock "refresh_cache-3d5b6824-153d-43ab-b274-38d733da2664" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG nova.network.neutron [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Instance cache missing network info. {{(pid=71283) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG nova.network.neutron [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Updating instance_info_cache with network_info: [{"id": "a5802bfe-63ff-426d-b2af-149698ad1a21", "address": "fa:16:3e:fe:c4:80", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5802bfe-63", "ovs_interfaceid": "a5802bfe-63ff-426d-b2af-149698ad1a21", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Releasing lock "refresh_cache-3d5b6824-153d-43ab-b274-38d733da2664" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG nova.compute.manager [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Instance network_info: |[{"id": "a5802bfe-63ff-426d-b2af-149698ad1a21", "address": "fa:16:3e:fe:c4:80", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5802bfe-63", 
"ovs_interfaceid": "a5802bfe-63ff-426d-b2af-149698ad1a21", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-abc8e3a3-71bb-49b7-ad97-7ccea3a51053 req-3b54f8ee-09b9-4d6c-bc08-2259ff5a97a1 service nova] Acquired lock "refresh_cache-3d5b6824-153d-43ab-b274-38d733da2664" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG nova.network.neutron [req-abc8e3a3-71bb-49b7-ad97-7ccea3a51053 req-3b54f8ee-09b9-4d6c-bc08-2259ff5a97a1 service nova] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Refreshing network info cache for port a5802bfe-63ff-426d-b2af-149698ad1a21 {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG nova.compute.claims [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Aborting claim: {{(pid=71283) abort /opt/stack/nova/nova/compute/claims.py:84}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 
tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Inventory has not changed for 
provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.211s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG nova.compute.manager [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Build of instance 3d5b6824-153d-43ab-b274-38d733da2664 aborted: Volume a7649a67-45bb-48f9-a19c-57fc0cd1a24b did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. 
{{(pid=71283) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2636}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG nova.compute.utils [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Build of instance 3d5b6824-153d-43ab-b274-38d733da2664 aborted: Volume a7649a67-45bb-48f9-a19c-57fc0cd1a24b did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. {{(pid=71283) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.110s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:41:24 user nova-compute[71283]: ERROR nova.compute.manager [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Build of instance 
3d5b6824-153d-43ab-b274-38d733da2664 aborted: Volume a7649a67-45bb-48f9-a19c-57fc0cd1a24b did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error.: nova.exception.BuildAbortException: Build of instance 3d5b6824-153d-43ab-b274-38d733da2664 aborted: Volume a7649a67-45bb-48f9-a19c-57fc0cd1a24b did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. Apr 20 10:41:24 user nova-compute[71283]: DEBUG nova.compute.manager [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Unplugging VIFs for instance {{(pid=71283) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:41:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-172044684',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-172044684',id=20,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5bcb8e3930844068bd2b496914a4d764',ramdisk_id='',reservation_id='r-p9wzfa0e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-134694710',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member'},tags=TagList,task_state='block_device_mapping',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:41:23Z,user_data=None,user_id='1b7d729afb5942b8a1753ffaa4d5a268',uui
d=3d5b6824-153d-43ab-b274-38d733da2664,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a5802bfe-63ff-426d-b2af-149698ad1a21", "address": "fa:16:3e:fe:c4:80", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5802bfe-63", "ovs_interfaceid": "a5802bfe-63ff-426d-b2af-149698ad1a21", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Converting VIF {"id": "a5802bfe-63ff-426d-b2af-149698ad1a21", "address": "fa:16:3e:fe:c4:80", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5802bfe-63", "ovs_interfaceid": "a5802bfe-63ff-426d-b2af-149698ad1a21", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fe:c4:80,bridge_name='br-int',has_traffic_filtering=True,id=a5802bfe-63ff-426d-b2af-149698ad1a21,network=Network(48558ac2-005d-42a1-a003-2662a04d6946),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5802bfe-63') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG os_vif [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:c4:80,bridge_name='br-int',has_traffic_filtering=True,id=a5802bfe-63ff-426d-b2af-149698ad1a21,network=Network(48558ac2-005d-42a1-a003-2662a04d6946),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5802bfe-63') {{(pid=71283) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa5802bfe-63, bridge=br-int, if_exists=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 10:41:24 user nova-compute[71283]: INFO os_vif [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:c4:80,bridge_name='br-int',has_traffic_filtering=True,id=a5802bfe-63ff-426d-b2af-149698ad1a21,network=Network(48558ac2-005d-42a1-a003-2662a04d6946),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5802bfe-63') Apr 20 10:41:24 user nova-compute[71283]: DEBUG nova.compute.manager [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Unplugged VIFs for instance {{(pid=71283) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG nova.compute.manager [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Deallocating network for instance {{(pid=71283) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 10:41:24 user 
nova-compute[71283]: DEBUG nova.network.neutron [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] deallocate_for_instance() {{(pid=71283) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 10:41:24 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG nova.network.neutron [req-abc8e3a3-71bb-49b7-ad97-7ccea3a51053 req-3b54f8ee-09b9-4d6c-bc08-2259ff5a97a1 service nova] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Updated VIF entry in instance network info cache for port a5802bfe-63ff-426d-b2af-149698ad1a21. 
{{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG nova.network.neutron [req-abc8e3a3-71bb-49b7-ad97-7ccea3a51053 req-3b54f8ee-09b9-4d6c-bc08-2259ff5a97a1 service nova] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Updating instance_info_cache with network_info: [{"id": "a5802bfe-63ff-426d-b2af-149698ad1a21", "address": "fa:16:3e:fe:c4:80", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5802bfe-63", "ovs_interfaceid": "a5802bfe-63ff-426d-b2af-149698ad1a21", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG 
oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-abc8e3a3-71bb-49b7-ad97-7ccea3a51053 req-3b54f8ee-09b9-4d6c-bc08-2259ff5a97a1 service nova] Releasing lock "refresh_cache-3d5b6824-153d-43ab-b274-38d733da2664" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/353f8fcd-6f6e-43e8-b46c-0258d7a6a17b/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/353f8fcd-6f6e-43e8-b46c-0258d7a6a17b/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/353f8fcd-6f6e-43e8-b46c-0258d7a6a17b/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG nova.compute.manager [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Instance event wait completed in 0 seconds for {{(pid=71283) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Guest created on hypervisor {{(pid=71283) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Resumed> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:41:25 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] VM Resumed (Lifecycle Event) Apr 20 10:41:25 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: 
353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Instance spawned successfully. Apr 20 10:41:25 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Found default for hw_cdrom_bus of ide {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 
353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Found default for hw_disk_bus of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Found default for hw_input_bus of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Found default for hw_pointer_model of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Found default for hw_video_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Found default for hw_vif_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD 
"/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/353f8fcd-6f6e-43e8-b46c-0258d7a6a17b/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:41:25 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 10:41:25 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Started> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:41:25 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] VM Started (Lifecycle Event) Apr 20 10:41:25 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:41:25 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 20 10:41:25 user nova-compute[71283]: INFO nova.compute.manager [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Took 6.13 seconds to spawn the instance on the hypervisor. Apr 20 10:41:25 user nova-compute[71283]: DEBUG nova.compute.manager [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG nova.compute.manager [req-b2f837e9-8ca2-4ab0-a65e-63c8c211b55e req-96ed4998-b63f-4c25-b87a-89196f4f5b57 service nova] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Received event network-vif-plugged-1cf8ddec-3f1f-4db3-9dfc-fb761edd6929 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-b2f837e9-8ca2-4ab0-a65e-63c8c211b55e req-96ed4998-b63f-4c25-b87a-89196f4f5b57 service nova] Acquiring lock "353f8fcd-6f6e-43e8-b46c-0258d7a6a17b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-b2f837e9-8ca2-4ab0-a65e-63c8c211b55e req-96ed4998-b63f-4c25-b87a-89196f4f5b57 service nova] Lock "353f8fcd-6f6e-43e8-b46c-0258d7a6a17b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils 
[req-b2f837e9-8ca2-4ab0-a65e-63c8c211b55e req-96ed4998-b63f-4c25-b87a-89196f4f5b57 service nova] Lock "353f8fcd-6f6e-43e8-b46c-0258d7a6a17b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG nova.compute.manager [req-b2f837e9-8ca2-4ab0-a65e-63c8c211b55e req-96ed4998-b63f-4c25-b87a-89196f4f5b57 service nova] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] No waiting events found dispatching network-vif-plugged-1cf8ddec-3f1f-4db3-9dfc-fb761edd6929 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:41:25 user nova-compute[71283]: WARNING nova.compute.manager [req-b2f837e9-8ca2-4ab0-a65e-63c8c211b55e req-96ed4998-b63f-4c25-b87a-89196f4f5b57 service nova] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Received unexpected event network-vif-plugged-1cf8ddec-3f1f-4db3-9dfc-fb761edd6929 for instance with vm_state building and task_state spawning. Apr 20 10:41:25 user nova-compute[71283]: DEBUG nova.network.neutron [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:41:25 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:41:25 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 20 10:41:25 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=9059MB free_disk=26.4803466796875GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": 
"0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view 
/opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:41:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:41:25 user nova-compute[71283]: INFO nova.compute.manager [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 3d5b6824-153d-43ab-b274-38d733da2664] Took 1.12 seconds to deallocate network for instance. Apr 20 10:41:26 user nova-compute[71283]: INFO nova.compute.manager [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Took 6.74 seconds to build instance. 
Apr 20 10:41:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-d0aca3a8-29ea-4a08-9edd-2ac46ec37e02 tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "353f8fcd-6f6e-43e8-b46c-0258d7a6a17b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.834s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:41:26 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 4277833d-7119-4772-ba6d-dfa7368a652b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:41:26 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:41:26 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 3d5b6824-153d-43ab-b274-38d733da2664 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}} Apr 20 10:41:26 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:41:26 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:41:26 user nova-compute[71283]: INFO nova.scheduler.client.report [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Deleted allocations for instance 3d5b6824-153d-43ab-b274-38d733da2664 Apr 20 10:41:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-ee821388-a4c2-4c00-a9d3-5a2958a81b94 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "3d5b6824-153d-43ab-b274-38d733da2664" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.852s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:41:26 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:41:26 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:41:26 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:41:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.357s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:41:26 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:41:27 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:41:27 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:41:27 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:41:28 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:41:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:41:30 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:41:30 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:41:30 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Didn't find any instances for network info cache update. 
{{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 20 10:41:31 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:41:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:41:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:41:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:41:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:41:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:41:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:41:39 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:41:41 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:41:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:41:46 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:41:51 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:41:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:41:56 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:42:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:42:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:42:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:42:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:42:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:42:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 
20 10:42:06 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:42:06 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:42:06 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:42:06 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:42:06 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:42:06 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:42:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:42:11 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:42:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquiring lock "f950a703-4df1-432a-804c-a65807673d01" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:42:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "f950a703-4df1-432a-804c-a65807673d01" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:42:13 user nova-compute[71283]: DEBUG nova.compute.manager [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Starting instance... {{(pid=71283) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 10:42:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:42:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:42:14 user nova-compute[71283]: DEBUG nova.virt.hardware [None 
req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71283) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 10:42:14 user nova-compute[71283]: INFO nova.compute.claims [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Claim successful on node user Apr 20 10:42:14 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:42:14 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:42:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-465a2883-fcae-48bb-af30-c713df9ebaea 
tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.253s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:42:14 user nova-compute[71283]: DEBUG nova.compute.manager [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Start building networks asynchronously for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 10:42:14 user nova-compute[71283]: DEBUG nova.compute.manager [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Allocating IP information in the background. {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 10:42:14 user nova-compute[71283]: DEBUG nova.network.neutron [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] allocate_for_instance() {{(pid=71283) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 10:42:14 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 20 10:42:14 user nova-compute[71283]: DEBUG nova.compute.manager [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Start building block device mappings for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 10:42:14 user nova-compute[71283]: DEBUG nova.policy [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1b7d729afb5942b8a1753ffaa4d5a268', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5bcb8e3930844068bd2b496914a4d764', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71283) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 10:42:14 user nova-compute[71283]: DEBUG nova.compute.manager [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Start spawning the instance on the hypervisor. 
{{(pid=71283) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 10:42:14 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Creating instance directory {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 10:42:14 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Creating image(s) Apr 20 10:42:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquiring lock "/opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:42:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "/opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:42:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-465a2883-fcae-48bb-af30-c713df9ebaea 
tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "/opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:42:14 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:42:14 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.145s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:42:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquiring lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" 
{{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:42:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:42:14 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:42:14 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.135s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:42:14 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 
tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk 1073741824 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:42:14 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk 1073741824" returned: 0 in 0.047s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:42:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.188s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:42:14 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:42:14 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.131s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:42:14 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Checking if we can resize image /opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk. 
size=1073741824 {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 10:42:14 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:42:15 user nova-compute[71283]: DEBUG nova.network.neutron [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Successfully created port: 441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a {{(pid=71283) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 10:42:15 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:42:15 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Cannot resize image 
/opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk to a smaller size. {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 10:42:15 user nova-compute[71283]: DEBUG nova.objects.instance [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lazy-loading 'migration_context' on Instance uuid f950a703-4df1-432a-804c-a65807673d01 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:42:15 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Created local disks {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 10:42:15 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Ensure instance console log exists: /opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/console.log {{(pid=71283) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 10:42:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:42:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None 
req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:42:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:42:15 user nova-compute[71283]: DEBUG nova.network.neutron [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Successfully updated port: 441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a {{(pid=71283) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 10:42:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquiring lock "refresh_cache-f950a703-4df1-432a-804c-a65807673d01" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:42:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquired lock "refresh_cache-f950a703-4df1-432a-804c-a65807673d01" {{(pid=71283) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:42:15 user nova-compute[71283]: DEBUG nova.network.neutron [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Building network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 10:42:15 user nova-compute[71283]: DEBUG nova.compute.manager [req-2790b7e3-a0cd-42ee-b36a-c8f350f82b60 req-44fe5e1f-94da-4f3d-b460-afa5bc212b15 service nova] [instance: f950a703-4df1-432a-804c-a65807673d01] Received event network-changed-441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:42:15 user nova-compute[71283]: DEBUG nova.compute.manager [req-2790b7e3-a0cd-42ee-b36a-c8f350f82b60 req-44fe5e1f-94da-4f3d-b460-afa5bc212b15 service nova] [instance: f950a703-4df1-432a-804c-a65807673d01] Refreshing instance network info cache due to event network-changed-441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a. {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:42:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-2790b7e3-a0cd-42ee-b36a-c8f350f82b60 req-44fe5e1f-94da-4f3d-b460-afa5bc212b15 service nova] Acquiring lock "refresh_cache-f950a703-4df1-432a-804c-a65807673d01" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:42:15 user nova-compute[71283]: DEBUG nova.network.neutron [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Instance cache missing network info. 
{{(pid=71283) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG nova.network.neutron [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Updating instance_info_cache with network_info: [{"id": "441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a", "address": "fa:16:3e:3d:bd:3a", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap441b9a7d-3a", "ovs_interfaceid": "441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Releasing lock "refresh_cache-f950a703-4df1-432a-804c-a65807673d01" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG 
nova.compute.manager [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Instance network_info: |[{"id": "441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a", "address": "fa:16:3e:3d:bd:3a", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap441b9a7d-3a", "ovs_interfaceid": "441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-2790b7e3-a0cd-42ee-b36a-c8f350f82b60 req-44fe5e1f-94da-4f3d-b460-afa5bc212b15 service nova] Acquired lock "refresh_cache-f950a703-4df1-432a-804c-a65807673d01" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG nova.network.neutron [req-2790b7e3-a0cd-42ee-b36a-c8f350f82b60 req-44fe5e1f-94da-4f3d-b460-afa5bc212b15 service nova] [instance: f950a703-4df1-432a-804c-a65807673d01] Refreshing network info cache for port 441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a 
{{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Start _get_guest_xml network_info=[{"id": "441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a", "address": "fa:16:3e:3d:bd:3a", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap441b9a7d-3a", "ovs_interfaceid": "441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '3687b278-c2bc-46f2-9b7e-579c8d06fe41'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 10:42:16 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:42:16 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:42:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71283) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T10:25:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=), allow threads: True {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Flavor limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-465a2883-fcae-48bb-af30-c713df9ebaea 
tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Image limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Flavor pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Image pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 
tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Got 1 possible topologies {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:42:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1001420332',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1001420332',id=21,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5bcb8e3930844068bd2b496914a4d764',ramdisk_id='',reservation_id='r-whoq06s6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-134694710',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:42:14Z,user_data=None,user_id='1b7d729afb5942b8
a1753ffaa4d5a268',uuid=f950a703-4df1-432a-804c-a65807673d01,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a", "address": "fa:16:3e:3d:bd:3a", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap441b9a7d-3a", "ovs_interfaceid": "441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71283) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Converting VIF {"id": "441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a", "address": "fa:16:3e:3d:bd:3a", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": 
"10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap441b9a7d-3a", "ovs_interfaceid": "441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:bd:3a,bridge_name='br-int',has_traffic_filtering=True,id=441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a,network=Network(48558ac2-005d-42a1-a003-2662a04d6946),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap441b9a7d-3a') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG nova.objects.instance [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lazy-loading 'pci_devices' on Instance uuid f950a703-4df1-432a-804c-a65807673d01 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] End _get_guest_xml xml= Apr 20 10:42:16 user nova-compute[71283]: 
[libvirt guest domain XML elided: markup was stripped during log extraction and per-line syslog prefixes injected; recoverable fields include uuid f950a703-4df1-432a-804c-a65807673d01, domain name instance-00000015, memory 131072, 1 vCPU, nova display name tempest-ServerBootFromVolumeStableRescueTest-server-1001420332, creation time 2023-04-20 10:42:16, project/user tempest-ServerBootFromVolumeStableRescueTest-134694710(-project-member), sysinfo OpenStack Foundation / OpenStack Nova 0.0.0, os type hvm, CPU model Nehalem, RNG backend /dev/urandom] {{(pid=71283) _get_guest_xml
/opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:42:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1001420332',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1001420332',id=21,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5bcb8e3930844068bd2b496914a4d764',ramdisk_id='',reservation_id='r-whoq06s6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocat
ed='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-134694710',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:42:14Z,user_data=None,user_id='1b7d729afb5942b8a1753ffaa4d5a268',uuid=f950a703-4df1-432a-804c-a65807673d01,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a", "address": "fa:16:3e:3d:bd:3a", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap441b9a7d-3a", "ovs_interfaceid": "441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Converting VIF {"id": "441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a", "address": "fa:16:3e:3d:bd:3a", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": 
"tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap441b9a7d-3a", "ovs_interfaceid": "441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:bd:3a,bridge_name='br-int',has_traffic_filtering=True,id=441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a,network=Network(48558ac2-005d-42a1-a003-2662a04d6946),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap441b9a7d-3a') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG os_vif [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Plugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:3d:bd:3a,bridge_name='br-int',has_traffic_filtering=True,id=441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a,network=Network(48558ac2-005d-42a1-a003-2662a04d6946),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap441b9a7d-3a') {{(pid=71283) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap441b9a7d-3a, may_exist=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap441b9a7d-3a, col_values=(('external_ids', {'iface-id': '441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3d:bd:3a', 'vm-uuid': 'f950a703-4df1-432a-804c-a65807673d01'}),)) 
{{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:42:16 user nova-compute[71283]: INFO os_vif [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:bd:3a,bridge_name='br-int',has_traffic_filtering=True,id=441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a,network=Network(48558ac2-005d-42a1-a003-2662a04d6946),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap441b9a7d-3a') Apr 20 10:42:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71283) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] No VIF found with MAC fa:16:3e:3d:bd:3a, not building metadata {{(pid=71283) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG nova.network.neutron [req-2790b7e3-a0cd-42ee-b36a-c8f350f82b60 req-44fe5e1f-94da-4f3d-b460-afa5bc212b15 service nova] [instance: f950a703-4df1-432a-804c-a65807673d01] Updated VIF entry in instance network info cache for port 441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a. {{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG nova.network.neutron [req-2790b7e3-a0cd-42ee-b36a-c8f350f82b60 req-44fe5e1f-94da-4f3d-b460-afa5bc212b15 service nova] [instance: f950a703-4df1-432a-804c-a65807673d01] Updating instance_info_cache with network_info: [{"id": "441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a", "address": "fa:16:3e:3d:bd:3a", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap441b9a7d-3a", "ovs_interfaceid": "441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a", 
"qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:42:16 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-2790b7e3-a0cd-42ee-b36a-c8f350f82b60 req-44fe5e1f-94da-4f3d-b460-afa5bc212b15 service nova] Releasing lock "refresh_cache-f950a703-4df1-432a-804c-a65807673d01" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:42:17 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:42:17 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:42:17 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:42:17 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:42:17 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:42:17 user nova-compute[71283]: DEBUG nova.compute.manager [req-900a9a8a-6211-42c4-9d1f-e1f1e27ebf2f req-c0dae935-4532-435c-8199-2b14da1da2b5 service nova] [instance: f950a703-4df1-432a-804c-a65807673d01] Received event network-vif-plugged-441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:42:17 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils 
[req-900a9a8a-6211-42c4-9d1f-e1f1e27ebf2f req-c0dae935-4532-435c-8199-2b14da1da2b5 service nova] Acquiring lock "f950a703-4df1-432a-804c-a65807673d01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:42:17 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-900a9a8a-6211-42c4-9d1f-e1f1e27ebf2f req-c0dae935-4532-435c-8199-2b14da1da2b5 service nova] Lock "f950a703-4df1-432a-804c-a65807673d01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:42:17 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-900a9a8a-6211-42c4-9d1f-e1f1e27ebf2f req-c0dae935-4532-435c-8199-2b14da1da2b5 service nova] Lock "f950a703-4df1-432a-804c-a65807673d01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:42:17 user nova-compute[71283]: DEBUG nova.compute.manager [req-900a9a8a-6211-42c4-9d1f-e1f1e27ebf2f req-c0dae935-4532-435c-8199-2b14da1da2b5 service nova] [instance: f950a703-4df1-432a-804c-a65807673d01] No waiting events found dispatching network-vif-plugged-441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:42:17 user nova-compute[71283]: WARNING nova.compute.manager [req-900a9a8a-6211-42c4-9d1f-e1f1e27ebf2f req-c0dae935-4532-435c-8199-2b14da1da2b5 service nova] [instance: f950a703-4df1-432a-804c-a65807673d01] Received unexpected event network-vif-plugged-441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a for instance with vm_state building and task_state spawning. 
Apr 20 10:42:18 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:42:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:42:19 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Resumed> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:42:19 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: f950a703-4df1-432a-804c-a65807673d01] VM Resumed (Lifecycle Event) Apr 20 10:42:19 user nova-compute[71283]: DEBUG nova.compute.manager [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Instance event wait completed in 0 seconds for {{(pid=71283) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 10:42:19 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Guest created on hypervisor {{(pid=71283) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 10:42:19 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: f950a703-4df1-432a-804c-a65807673d01] Instance spawned successfully. 
Apr 20 10:42:19 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 10:42:19 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: f950a703-4df1-432a-804c-a65807673d01] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:42:19 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: f950a703-4df1-432a-804c-a65807673d01] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:42:19 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Found default for hw_cdrom_bus of ide {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:42:19 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] 
Found default for hw_disk_bus of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:42:19 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Found default for hw_input_bus of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:42:19 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Found default for hw_pointer_model of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:42:19 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Found default for hw_video_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:42:19 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Found default for hw_vif_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:42:19 user nova-compute[71283]: INFO nova.compute.manager [None 
req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: f950a703-4df1-432a-804c-a65807673d01] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 10:42:19 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Started> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:42:19 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: f950a703-4df1-432a-804c-a65807673d01] VM Started (Lifecycle Event) Apr 20 10:42:19 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: f950a703-4df1-432a-804c-a65807673d01] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:42:19 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: f950a703-4df1-432a-804c-a65807673d01] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:42:19 user nova-compute[71283]: INFO nova.compute.manager [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Took 5.32 seconds to spawn the instance on the hypervisor. 
Apr 20 10:42:19 user nova-compute[71283]: DEBUG nova.compute.manager [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:42:19 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: f950a703-4df1-432a-804c-a65807673d01] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 10:42:19 user nova-compute[71283]: DEBUG nova.compute.manager [req-54a1b97d-4020-4c72-8997-62f2aca90d59 req-26a8b23a-f4e4-4631-9e82-85b7a2a583ef service nova] [instance: f950a703-4df1-432a-804c-a65807673d01] Received event network-vif-plugged-441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:42:19 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-54a1b97d-4020-4c72-8997-62f2aca90d59 req-26a8b23a-f4e4-4631-9e82-85b7a2a583ef service nova] Acquiring lock "f950a703-4df1-432a-804c-a65807673d01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:42:19 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-54a1b97d-4020-4c72-8997-62f2aca90d59 req-26a8b23a-f4e4-4631-9e82-85b7a2a583ef service nova] Lock "f950a703-4df1-432a-804c-a65807673d01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:42:19 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-54a1b97d-4020-4c72-8997-62f2aca90d59 req-26a8b23a-f4e4-4631-9e82-85b7a2a583ef service nova] 
Lock "f950a703-4df1-432a-804c-a65807673d01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:42:19 user nova-compute[71283]: DEBUG nova.compute.manager [req-54a1b97d-4020-4c72-8997-62f2aca90d59 req-26a8b23a-f4e4-4631-9e82-85b7a2a583ef service nova] [instance: f950a703-4df1-432a-804c-a65807673d01] No waiting events found dispatching network-vif-plugged-441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:42:19 user nova-compute[71283]: WARNING nova.compute.manager [req-54a1b97d-4020-4c72-8997-62f2aca90d59 req-26a8b23a-f4e4-4631-9e82-85b7a2a583ef service nova] [instance: f950a703-4df1-432a-804c-a65807673d01] Received unexpected event network-vif-plugged-441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a for instance with vm_state building and task_state spawning. Apr 20 10:42:19 user nova-compute[71283]: INFO nova.compute.manager [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Took 5.87 seconds to build instance. 
Apr 20 10:42:19 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-465a2883-fcae-48bb-af30-c713df9ebaea tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "f950a703-4df1-432a-804c-a65807673d01" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 5.961s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:42:21 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:42:21 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:42:24 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:42:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:42:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:42:24 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:42:24 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:42:24 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:42:24 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:42:24 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:42:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:42:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:42:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:42:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share 
--output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:42:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:42:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/353f8fcd-6f6e-43e8-b46c-0258d7a6a17b/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:42:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/353f8fcd-6f6e-43e8-b46c-0258d7a6a17b/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:42:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/353f8fcd-6f6e-43e8-b46c-0258d7a6a17b/disk --force-share --output=json {{(pid=71283) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:42:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/353f8fcd-6f6e-43e8-b46c-0258d7a6a17b/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:42:26 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:42:26 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:42:26 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=8909MB free_disk=26.458602905273438GB free_vcpus=9 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 
20 10:42:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:42:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:42:26 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 4277833d-7119-4772-ba6d-dfa7368a652b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:42:26 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:42:26 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance f950a703-4df1-432a-804c-a65807673d01 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:42:26 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 3 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:42:26 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=896MB phys_disk=40GB used_disk=3GB total_vcpus=12 used_vcpus=3 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:42:26 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:42:26 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:42:26 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:42:26 user nova-compute[71283]: DEBUG 
nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:42:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.271s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:42:27 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:42:27 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:42:27 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:42:27 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:42:27 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:42:28 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:42:31 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:42:31 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:42:31 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:42:31 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Rebuilding the list of instances to heal {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 10:42:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "refresh_cache-4277833d-7119-4772-ba6d-dfa7368a652b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:42:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquired lock "refresh_cache-4277833d-7119-4772-ba6d-dfa7368a652b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:42:31 user 
nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Forcefully refreshing network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 10:42:31 user nova-compute[71283]: DEBUG nova.objects.instance [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lazy-loading 'info_cache' on Instance uuid 4277833d-7119-4772-ba6d-dfa7368a652b {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:42:32 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Updating instance_info_cache with network_info: [{"id": "02bf30f9-074c-4a50-a4c8-d468c31067a8", "address": "fa:16:3e:ee:07:da", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap02bf30f9-07", "ovs_interfaceid": "02bf30f9-074c-4a50-a4c8-d468c31067a8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:42:32 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Releasing lock "refresh_cache-4277833d-7119-4772-ba6d-dfa7368a652b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:42:32 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Updated the network info_cache for instance {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 10:42:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:42:36 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:42:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:42:39 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:42:41 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:42:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:42:46 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:42:49 user nova-compute[71283]: 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:42:51 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:42:56 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:43:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:43:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:43:06 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:43:10 user nova-compute[71283]: DEBUG nova.compute.manager [req-16b23295-55ec-4fcf-bf05-28f5ae9d4c84 req-9aa6d9ac-44e5-4ad8-a462-47f39b768151 service nova] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Received event network-changed-1cf8ddec-3f1f-4db3-9dfc-fb761edd6929 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:43:10 user nova-compute[71283]: DEBUG nova.compute.manager [req-16b23295-55ec-4fcf-bf05-28f5ae9d4c84 req-9aa6d9ac-44e5-4ad8-a462-47f39b768151 service nova] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Refreshing instance network info cache due to event network-changed-1cf8ddec-3f1f-4db3-9dfc-fb761edd6929. 
{{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:43:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-16b23295-55ec-4fcf-bf05-28f5ae9d4c84 req-9aa6d9ac-44e5-4ad8-a462-47f39b768151 service nova] Acquiring lock "refresh_cache-353f8fcd-6f6e-43e8-b46c-0258d7a6a17b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:43:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-16b23295-55ec-4fcf-bf05-28f5ae9d4c84 req-9aa6d9ac-44e5-4ad8-a462-47f39b768151 service nova] Acquired lock "refresh_cache-353f8fcd-6f6e-43e8-b46c-0258d7a6a17b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:43:10 user nova-compute[71283]: DEBUG nova.network.neutron [req-16b23295-55ec-4fcf-bf05-28f5ae9d4c84 req-9aa6d9ac-44e5-4ad8-a462-47f39b768151 service nova] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Refreshing network info cache for port 1cf8ddec-3f1f-4db3-9dfc-fb761edd6929 {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:43:10 user nova-compute[71283]: DEBUG nova.network.neutron [req-16b23295-55ec-4fcf-bf05-28f5ae9d4c84 req-9aa6d9ac-44e5-4ad8-a462-47f39b768151 service nova] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Updated VIF entry in instance network info cache for port 1cf8ddec-3f1f-4db3-9dfc-fb761edd6929. 
{{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:43:10 user nova-compute[71283]: DEBUG nova.network.neutron [req-16b23295-55ec-4fcf-bf05-28f5ae9d4c84 req-9aa6d9ac-44e5-4ad8-a462-47f39b768151 service nova] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Updating instance_info_cache with network_info: [{"id": "1cf8ddec-3f1f-4db3-9dfc-fb761edd6929", "address": "fa:16:3e:2b:49:99", "network": {"id": "e38408cb-95b3-47d6-8d3c-32650adda382", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1326937209-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb2d81481a7d4ec1b28e5377055b3ed7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cf8ddec-3f", "ovs_interfaceid": "1cf8ddec-3f1f-4db3-9dfc-fb761edd6929", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:43:10 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-16b23295-55ec-4fcf-bf05-28f5ae9d4c84 req-9aa6d9ac-44e5-4ad8-a462-47f39b768151 service nova] Releasing lock "refresh_cache-353f8fcd-6f6e-43e8-b46c-0258d7a6a17b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:43:11 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:43:11 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:43:11 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:43:11 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:43:11 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:43:11 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:43:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-32521e78-cdae-400f-a677-234d1f97ed7f tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Acquiring lock "353f8fcd-6f6e-43e8-b46c-0258d7a6a17b" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:43:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-32521e78-cdae-400f-a677-234d1f97ed7f tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "353f8fcd-6f6e-43e8-b46c-0258d7a6a17b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:43:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-32521e78-cdae-400f-a677-234d1f97ed7f tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Acquiring lock "353f8fcd-6f6e-43e8-b46c-0258d7a6a17b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:43:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-32521e78-cdae-400f-a677-234d1f97ed7f tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "353f8fcd-6f6e-43e8-b46c-0258d7a6a17b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:43:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-32521e78-cdae-400f-a677-234d1f97ed7f tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "353f8fcd-6f6e-43e8-b46c-0258d7a6a17b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:43:11 user nova-compute[71283]: INFO nova.compute.manager [None req-32521e78-cdae-400f-a677-234d1f97ed7f tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Terminating instance Apr 20 10:43:11 user nova-compute[71283]: DEBUG nova.compute.manager [None req-32521e78-cdae-400f-a677-234d1f97ed7f tempest-AttachVolumeShelveTestJSON-1878611374 
tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Start destroying the instance on the hypervisor. {{(pid=71283) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 10:43:12 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:43:12 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:43:12 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:43:12 user nova-compute[71283]: DEBUG nova.compute.manager [req-56343266-5a0e-452c-8c3f-9fdf1c288e28 req-8e46bc6e-749c-41b3-9f42-65479b801332 service nova] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Received event network-vif-unplugged-1cf8ddec-3f1f-4db3-9dfc-fb761edd6929 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:43:12 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-56343266-5a0e-452c-8c3f-9fdf1c288e28 req-8e46bc6e-749c-41b3-9f42-65479b801332 service nova] Acquiring lock "353f8fcd-6f6e-43e8-b46c-0258d7a6a17b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:43:12 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-56343266-5a0e-452c-8c3f-9fdf1c288e28 req-8e46bc6e-749c-41b3-9f42-65479b801332 service nova] Lock "353f8fcd-6f6e-43e8-b46c-0258d7a6a17b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 
20 10:43:12 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-56343266-5a0e-452c-8c3f-9fdf1c288e28 req-8e46bc6e-749c-41b3-9f42-65479b801332 service nova] Lock "353f8fcd-6f6e-43e8-b46c-0258d7a6a17b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:43:12 user nova-compute[71283]: DEBUG nova.compute.manager [req-56343266-5a0e-452c-8c3f-9fdf1c288e28 req-8e46bc6e-749c-41b3-9f42-65479b801332 service nova] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] No waiting events found dispatching network-vif-unplugged-1cf8ddec-3f1f-4db3-9dfc-fb761edd6929 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:43:12 user nova-compute[71283]: DEBUG nova.compute.manager [req-56343266-5a0e-452c-8c3f-9fdf1c288e28 req-8e46bc6e-749c-41b3-9f42-65479b801332 service nova] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Received event network-vif-unplugged-1cf8ddec-3f1f-4db3-9dfc-fb761edd6929 for instance with task_state deleting. {{(pid=71283) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 10:43:12 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:43:12 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Instance destroyed successfully. 
Apr 20 10:43:12 user nova-compute[71283]: DEBUG nova.objects.instance [None req-32521e78-cdae-400f-a677-234d1f97ed7f tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lazy-loading 'resources' on Instance uuid 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:43:12 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-32521e78-cdae-400f-a677-234d1f97ed7f tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:41:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1678979542',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1678979542',id=19,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNGhoArKgEyYE7cjbkuljf5dfqWvULtvRqXeDkd3yU0VFytwA0oQTi1WmPBfETn5tfNR5w6ja8z1Fe252sgoGIglYuGRFBV1xpFytKMJ6xdB6Euq8vN0T70qYh0FRstlFQ==',key_name='tempest-keypair-1124108181',keypairs=<?>,launch_index=0,launched_at=2023-04-20T10:41:25Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cb2d81481a7d4ec1b28e5377055b3ed7',ramdisk_id='',reservation_id='r-t0ge5vja',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeShelveTestJSON-1878611374',owner_user_name='tempest-AttachVolumeShelveTestJSON-1878611374-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2023-04-20T10:41:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0a931240713e465197b96fa84574ba23',uuid=353f8fcd-6f6e-43e8-b46c-0258d7a6a17b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1cf8ddec-3f1f-4db3-9dfc-fb761edd6929", "address": "fa:16:3e:2b:49:99", "network": {"id": "e38408cb-95b3-47d6-8d3c-32650adda382", "bridge": "br-int", "label": 
"tempest-AttachVolumeShelveTestJSON-1326937209-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb2d81481a7d4ec1b28e5377055b3ed7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cf8ddec-3f", "ovs_interfaceid": "1cf8ddec-3f1f-4db3-9dfc-fb761edd6929", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 10:43:12 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-32521e78-cdae-400f-a677-234d1f97ed7f tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Converting VIF {"id": "1cf8ddec-3f1f-4db3-9dfc-fb761edd6929", "address": "fa:16:3e:2b:49:99", "network": {"id": "e38408cb-95b3-47d6-8d3c-32650adda382", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1326937209-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb2d81481a7d4ec1b28e5377055b3ed7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, 
"connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cf8ddec-3f", "ovs_interfaceid": "1cf8ddec-3f1f-4db3-9dfc-fb761edd6929", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:43:12 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-32521e78-cdae-400f-a677-234d1f97ed7f tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2b:49:99,bridge_name='br-int',has_traffic_filtering=True,id=1cf8ddec-3f1f-4db3-9dfc-fb761edd6929,network=Network(e38408cb-95b3-47d6-8d3c-32650adda382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cf8ddec-3f') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:43:12 user nova-compute[71283]: DEBUG os_vif [None req-32521e78-cdae-400f-a677-234d1f97ed7f tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2b:49:99,bridge_name='br-int',has_traffic_filtering=True,id=1cf8ddec-3f1f-4db3-9dfc-fb761edd6929,network=Network(e38408cb-95b3-47d6-8d3c-32650adda382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cf8ddec-3f') {{(pid=71283) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 10:43:12 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:43:12 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1cf8ddec-3f, bridge=br-int, 
if_exists=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:43:12 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:43:12 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:43:12 user nova-compute[71283]: INFO os_vif [None req-32521e78-cdae-400f-a677-234d1f97ed7f tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2b:49:99,bridge_name='br-int',has_traffic_filtering=True,id=1cf8ddec-3f1f-4db3-9dfc-fb761edd6929,network=Network(e38408cb-95b3-47d6-8d3c-32650adda382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cf8ddec-3f') Apr 20 10:43:12 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-32521e78-cdae-400f-a677-234d1f97ed7f tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Deleting instance files /opt/stack/data/nova/instances/353f8fcd-6f6e-43e8-b46c-0258d7a6a17b_del Apr 20 10:43:12 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-32521e78-cdae-400f-a677-234d1f97ed7f tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Deletion of /opt/stack/data/nova/instances/353f8fcd-6f6e-43e8-b46c-0258d7a6a17b_del complete Apr 20 10:43:12 user nova-compute[71283]: INFO nova.compute.manager [None req-32521e78-cdae-400f-a677-234d1f97ed7f tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] 
[instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Took 0.69 seconds to destroy the instance on the hypervisor. Apr 20 10:43:12 user nova-compute[71283]: DEBUG oslo.service.loopingcall [None req-32521e78-cdae-400f-a677-234d1f97ed7f tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=71283) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 10:43:12 user nova-compute[71283]: DEBUG nova.compute.manager [-] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Deallocating network for instance {{(pid=71283) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 10:43:12 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] deallocate_for_instance() {{(pid=71283) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 10:43:13 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:43:13 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Took 0.85 seconds to deallocate network for instance. 
Apr 20 10:43:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-32521e78-cdae-400f-a677-234d1f97ed7f tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:43:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-32521e78-cdae-400f-a677-234d1f97ed7f tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:43:13 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-32521e78-cdae-400f-a677-234d1f97ed7f tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:43:13 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-32521e78-cdae-400f-a677-234d1f97ed7f tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:43:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-32521e78-cdae-400f-a677-234d1f97ed7f tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.160s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:43:13 user nova-compute[71283]: INFO nova.scheduler.client.report [None req-32521e78-cdae-400f-a677-234d1f97ed7f tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Deleted allocations for instance 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b Apr 20 10:43:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-32521e78-cdae-400f-a677-234d1f97ed7f tempest-AttachVolumeShelveTestJSON-1878611374 tempest-AttachVolumeShelveTestJSON-1878611374-project-member] Lock "353f8fcd-6f6e-43e8-b46c-0258d7a6a17b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.885s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:43:14 user nova-compute[71283]: DEBUG nova.compute.manager [req-3c932497-aa44-41a2-8cc6-2b40da826719 req-9b08212d-259a-44d2-b625-68453da94554 service nova] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Received event network-vif-plugged-1cf8ddec-3f1f-4db3-9dfc-fb761edd6929 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:43:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-3c932497-aa44-41a2-8cc6-2b40da826719 req-9b08212d-259a-44d2-b625-68453da94554 service nova] Acquiring lock "353f8fcd-6f6e-43e8-b46c-0258d7a6a17b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:43:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-3c932497-aa44-41a2-8cc6-2b40da826719 req-9b08212d-259a-44d2-b625-68453da94554 service nova] Lock "353f8fcd-6f6e-43e8-b46c-0258d7a6a17b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:43:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-3c932497-aa44-41a2-8cc6-2b40da826719 req-9b08212d-259a-44d2-b625-68453da94554 service nova] Lock "353f8fcd-6f6e-43e8-b46c-0258d7a6a17b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:43:14 user nova-compute[71283]: DEBUG nova.compute.manager [req-3c932497-aa44-41a2-8cc6-2b40da826719 req-9b08212d-259a-44d2-b625-68453da94554 service nova] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] No waiting events found dispatching network-vif-plugged-1cf8ddec-3f1f-4db3-9dfc-fb761edd6929 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:43:14 user nova-compute[71283]: WARNING nova.compute.manager [req-3c932497-aa44-41a2-8cc6-2b40da826719 req-9b08212d-259a-44d2-b625-68453da94554 service nova] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Received unexpected event network-vif-plugged-1cf8ddec-3f1f-4db3-9dfc-fb761edd6929 for instance with vm_state deleted and task_state None. 
Apr 20 10:43:14 user nova-compute[71283]: DEBUG nova.compute.manager [req-3c932497-aa44-41a2-8cc6-2b40da826719 req-9b08212d-259a-44d2-b625-68453da94554 service nova] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Received event network-vif-deleted-1cf8ddec-3f1f-4db3-9dfc-fb761edd6929 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:43:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:43:17 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:43:20 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:43:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:43:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:43:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:43:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:43:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition 
/usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:43:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:43:23 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:43:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:43:25 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:43:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:43:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:43:25 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 
0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:43:25 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:43:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:43:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:43:25 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:43:26 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:43:26 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:43:26 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:43:26 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:43:26 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:43:26 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:43:26 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:43:26 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=9048MB free_disk=26.449966430664062GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": 
"type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", 
"vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, 
{"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": 
"15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 10:43:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:43:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:43:26 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 4277833d-7119-4772-ba6d-dfa7368a652b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:43:26 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance f950a703-4df1-432a-804c-a65807673d01 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:43:26 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:43:26 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:43:26 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:43:26 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 
'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:43:26 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:43:26 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.211s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:43:27 user nova-compute[71283]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:43:27 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] VM Stopped (Lifecycle Event) Apr 20 10:43:27 user nova-compute[71283]: DEBUG nova.compute.manager [None req-deb665df-c98a-4a21-a88e-3e2a363b1050 None None] [instance: 353f8fcd-6f6e-43e8-b46c-0258d7a6a17b] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:43:27 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:43:27 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:43:27 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a 
None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:43:27 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:43:27 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:43:27 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:43:30 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:43:32 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:43:32 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:43:32 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache 
{{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:43:32 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "refresh_cache-f950a703-4df1-432a-804c-a65807673d01" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:43:32 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquired lock "refresh_cache-f950a703-4df1-432a-804c-a65807673d01" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:43:32 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: f950a703-4df1-432a-804c-a65807673d01] Forcefully refreshing network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 10:43:33 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: f950a703-4df1-432a-804c-a65807673d01] Updating instance_info_cache with network_info: [{"id": "441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a", "address": "fa:16:3e:3d:bd:3a", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": 
"tap441b9a7d-3a", "ovs_interfaceid": "441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:43:33 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Releasing lock "refresh_cache-f950a703-4df1-432a-804c-a65807673d01" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:43:33 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: f950a703-4df1-432a-804c-a65807673d01] Updated the network info_cache for instance {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 10:43:37 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:43:39 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:43:42 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:43:47 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:43:52 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:43:57 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4994-ms timeout {{(pid=71283) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:43:57 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:43:57 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:43:57 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:43:57 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:43:57 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:44:02 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:44:02 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:44:04 user nova-compute[71283]: DEBUG nova.compute.manager [None req-0239579b-05ff-484e-b8ad-8089da2f734a tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:44:05 user nova-compute[71283]: INFO nova.compute.manager [None req-0239579b-05ff-484e-b8ad-8089da2f734a tempest-ServerBootFromVolumeStableRescueTest-134694710 
tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] instance snapshotting Apr 20 10:44:05 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-0239579b-05ff-484e-b8ad-8089da2f734a tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Beginning live snapshot process Apr 20 10:44:05 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-0239579b-05ff-484e-b8ad-8089da2f734a tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk --force-share --output=json -f qcow2 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:44:05 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-0239579b-05ff-484e-b8ad-8089da2f734a tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk --force-share --output=json -f qcow2" returned: 0 in 0.134s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:44:05 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-0239579b-05ff-484e-b8ad-8089da2f734a tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit 
--as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk --force-share --output=json -f qcow2 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:44:05 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-0239579b-05ff-484e-b8ad-8089da2f734a tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk --force-share --output=json -f qcow2" returned: 0 in 0.131s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:44:05 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-0239579b-05ff-484e-b8ad-8089da2f734a tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:44:05 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-0239579b-05ff-484e-b8ad-8089da2f734a tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.128s {{(pid=71283) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:44:05 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-0239579b-05ff-484e-b8ad-8089da2f734a tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmp9ircy8b_/b7f071e981f64ef3bd07202f39550c6c.delta 1073741824 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:44:05 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:44:05 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-0239579b-05ff-484e-b8ad-8089da2f734a tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmp9ircy8b_/b7f071e981f64ef3bd07202f39550c6c.delta 1073741824" returned: 0 in 0.054s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:44:05 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-0239579b-05ff-484e-b8ad-8089da2f734a tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Quiescing instance not available: QEMU guest agent is not enabled. 
Apr 20 10:44:06 user nova-compute[71283]: DEBUG nova.virt.libvirt.guest [None req-0239579b-05ff-484e-b8ad-8089da2f734a tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] COPY block job progress, current cursor: 0 final cursor: 43778048 {{(pid=71283) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 20 10:44:06 user nova-compute[71283]: DEBUG nova.virt.libvirt.guest [None req-0239579b-05ff-484e-b8ad-8089da2f734a tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] COPY block job progress, current cursor: 43778048 final cursor: 43778048 {{(pid=71283) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 20 10:44:06 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-0239579b-05ff-484e-b8ad-8089da2f734a tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Skipping quiescing instance: QEMU guest agent is not enabled. 
Apr 20 10:44:07 user nova-compute[71283]: DEBUG nova.privsep.utils [None req-0239579b-05ff-484e-b8ad-8089da2f734a tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71283) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 20 10:44:07 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-0239579b-05ff-484e-b8ad-8089da2f734a tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmp9ircy8b_/b7f071e981f64ef3bd07202f39550c6c.delta /opt/stack/data/nova/instances/snapshots/tmp9ircy8b_/b7f071e981f64ef3bd07202f39550c6c {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:44:07 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-0239579b-05ff-484e-b8ad-8089da2f734a tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmp9ircy8b_/b7f071e981f64ef3bd07202f39550c6c.delta /opt/stack/data/nova/instances/snapshots/tmp9ircy8b_/b7f071e981f64ef3bd07202f39550c6c" returned: 0 in 0.451s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:44:07 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-0239579b-05ff-484e-b8ad-8089da2f734a tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Snapshot extracted, beginning image upload Apr 20 10:44:07 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:44:09 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-0239579b-05ff-484e-b8ad-8089da2f734a tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Snapshot image upload complete Apr 20 10:44:09 user nova-compute[71283]: INFO nova.compute.manager [None req-0239579b-05ff-484e-b8ad-8089da2f734a tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Took 4.72 seconds to snapshot the instance on the hypervisor. Apr 20 10:44:12 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:44:17 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:44:21 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:44:21 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:44:21 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Cleaning up deleted instances {{(pid=71283) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} 
Apr 20 10:44:21 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] There are 0 instances to clean {{(pid=71283) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 20 10:44:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:44:24 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:44:26 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:44:26 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:44:27 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:44:27 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:44:27 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:44:27 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:44:27 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:44:27 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:44:27 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a 
None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:44:27 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:44:27 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:44:27 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:44:27 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:44:28 user nova-compute[71283]: 
DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:44:28 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:44:28 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json" returned: 0 in 0.161s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:44:28 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:44:28 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD 
"/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:44:28 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:44:28 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:44:28 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=9017MB free_disk=26.411216735839844GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", 
"product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": 
"7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": 
"type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 10:44:28 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:44:28 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:44:28 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 4277833d-7119-4772-ba6d-dfa7368a652b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:44:28 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance f950a703-4df1-432a-804c-a65807673d01 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:44:28 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:44:28 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:44:28 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:44:29 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 
'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:44:29 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:44:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.195s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:44:30 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:44:31 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:44:31 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Cleaning up deleted instances with incomplete migration {{(pid=71283) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 20 10:44:31 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71283) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:44:32 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:44:32 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:44:34 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:44:34 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:44:34 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Rebuilding the list of instances to heal {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 10:44:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "refresh_cache-4277833d-7119-4772-ba6d-dfa7368a652b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:44:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquired lock "refresh_cache-4277833d-7119-4772-ba6d-dfa7368a652b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:44:34 user nova-compute[71283]: 
DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Forcefully refreshing network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 10:44:34 user nova-compute[71283]: DEBUG nova.objects.instance [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lazy-loading 'info_cache' on Instance uuid 4277833d-7119-4772-ba6d-dfa7368a652b {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:44:35 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Updating instance_info_cache with network_info: [{"id": "02bf30f9-074c-4a50-a4c8-d468c31067a8", "address": "fa:16:3e:ee:07:da", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap02bf30f9-07", "ovs_interfaceid": "02bf30f9-074c-4a50-a4c8-d468c31067a8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:44:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Releasing lock "refresh_cache-4277833d-7119-4772-ba6d-dfa7368a652b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:44:35 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Updated the network info_cache for instance {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 10:44:37 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:44:37 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:44:37 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:44:37 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:44:37 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:44:37 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:44:37 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:44:42 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:44:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:44:44 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._sync_power_states {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:44:44 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Triggering sync for uuid 4277833d-7119-4772-ba6d-dfa7368a652b {{(pid=71283) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 20 10:44:44 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Triggering sync for uuid f950a703-4df1-432a-804c-a65807673d01 {{(pid=71283) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 20 10:44:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "4277833d-7119-4772-ba6d-dfa7368a652b" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:44:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "4277833d-7119-4772-ba6d-dfa7368a652b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:44:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "f950a703-4df1-432a-804c-a65807673d01" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:44:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "f950a703-4df1-432a-804c-a65807673d01" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:44:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "4277833d-7119-4772-ba6d-dfa7368a652b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.026s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:44:44 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "f950a703-4df1-432a-804c-a65807673d01" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.026s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:44:47 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:44:52 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:44:57 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:45:02 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:45:02 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:45:02 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:45:02 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:45:02 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:45:02 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:45:07 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:45:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquiring lock "23e0e8b8-22fc-4ab2-8bff-35e203715439" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:45:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "23e0e8b8-22fc-4ab2-8bff-35e203715439" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:45:11 user nova-compute[71283]: DEBUG nova.compute.manager [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Starting instance... 
{{(pid=71283) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 10:45:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:45:11 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:45:11 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71283) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 10:45:11 user nova-compute[71283]: INFO nova.compute.claims [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Claim successful on node user Apr 20 10:45:12 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:45:12 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:45:12 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.246s {{(pid=71283) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:45:12 user nova-compute[71283]: DEBUG nova.compute.manager [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Start building networks asynchronously for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 10:45:12 user nova-compute[71283]: DEBUG nova.compute.manager [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Allocating IP information in the background. {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 10:45:12 user nova-compute[71283]: DEBUG nova.network.neutron [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] allocate_for_instance() {{(pid=71283) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 10:45:12 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 20 10:45:12 user nova-compute[71283]: DEBUG nova.compute.manager [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Start building block device mappings for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 10:45:12 user nova-compute[71283]: DEBUG nova.policy [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1b7d729afb5942b8a1753ffaa4d5a268', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5bcb8e3930844068bd2b496914a4d764', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71283) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 10:45:12 user nova-compute[71283]: INFO nova.virt.block_device [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Booting with volume-backed-image 3687b278-c2bc-46f2-9b7e-579c8d06fe41 at /dev/vda Apr 20 10:45:12 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:45:12 user nova-compute[71283]: WARNING nova.compute.manager [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 
tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Volume id: 3528c135-6b49-4596-ab3a-85ad549ebbbf finished being created but its status is error. Apr 20 10:45:12 user nova-compute[71283]: ERROR nova.compute.manager [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Instance failed block device setup: nova.exception.VolumeNotCreated: Volume 3528c135-6b49-4596-ab3a-85ad549ebbbf did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. Apr 20 10:45:12 user nova-compute[71283]: ERROR nova.compute.manager [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Traceback (most recent call last): Apr 20 10:45:12 user nova-compute[71283]: ERROR nova.compute.manager [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] File "/opt/stack/nova/nova/compute/manager.py", line 2175, in _prep_block_device Apr 20 10:45:12 user nova-compute[71283]: ERROR nova.compute.manager [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] driver_block_device.attach_block_devices( Apr 20 10:45:12 user nova-compute[71283]: ERROR nova.compute.manager [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] File "/opt/stack/nova/nova/virt/block_device.py", line 936, in attach_block_devices Apr 20 10:45:12 user nova-compute[71283]: ERROR nova.compute.manager [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] _log_and_attach(device) Apr 20 10:45:12 user nova-compute[71283]: ERROR nova.compute.manager [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] File "/opt/stack/nova/nova/virt/block_device.py", line 933, in _log_and_attach Apr 20 10:45:12 user nova-compute[71283]: ERROR nova.compute.manager [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] bdm.attach(*attach_args, **attach_kwargs) Apr 20 10:45:12 user nova-compute[71283]: ERROR nova.compute.manager [instance: 
23e0e8b8-22fc-4ab2-8bff-35e203715439] File "/opt/stack/nova/nova/virt/block_device.py", line 831, in attach Apr 20 10:45:12 user nova-compute[71283]: ERROR nova.compute.manager [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] self.volume_id, self.attachment_id = self._create_volume( Apr 20 10:45:12 user nova-compute[71283]: ERROR nova.compute.manager [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] File "/opt/stack/nova/nova/virt/block_device.py", line 435, in _create_volume Apr 20 10:45:12 user nova-compute[71283]: ERROR nova.compute.manager [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] self._call_wait_func(context, wait_func, volume_api, vol['id']) Apr 20 10:45:12 user nova-compute[71283]: ERROR nova.compute.manager [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] File "/opt/stack/nova/nova/virt/block_device.py", line 785, in _call_wait_func Apr 20 10:45:12 user nova-compute[71283]: ERROR nova.compute.manager [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] with excutils.save_and_reraise_exception(): Apr 20 10:45:12 user nova-compute[71283]: ERROR nova.compute.manager [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ Apr 20 10:45:12 user nova-compute[71283]: ERROR nova.compute.manager [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] self.force_reraise() Apr 20 10:45:12 user nova-compute[71283]: ERROR nova.compute.manager [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise Apr 20 10:45:12 user nova-compute[71283]: ERROR nova.compute.manager [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] raise self.value Apr 20 10:45:12 user nova-compute[71283]: ERROR nova.compute.manager [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] File "/opt/stack/nova/nova/virt/block_device.py", line 783, in _call_wait_func Apr 20 10:45:12 user nova-compute[71283]: ERROR nova.compute.manager [instance: 
23e0e8b8-22fc-4ab2-8bff-35e203715439] wait_func(context, volume_id) Apr 20 10:45:12 user nova-compute[71283]: ERROR nova.compute.manager [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] File "/opt/stack/nova/nova/compute/manager.py", line 1792, in _await_block_device_map_created Apr 20 10:45:12 user nova-compute[71283]: ERROR nova.compute.manager [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] raise exception.VolumeNotCreated(volume_id=vol_id, Apr 20 10:45:12 user nova-compute[71283]: ERROR nova.compute.manager [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] nova.exception.VolumeNotCreated: Volume 3528c135-6b49-4596-ab3a-85ad549ebbbf did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. Apr 20 10:45:12 user nova-compute[71283]: ERROR nova.compute.manager [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Apr 20 10:45:13 user nova-compute[71283]: DEBUG nova.network.neutron [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Successfully created port: e5f8609e-9e75-41d5-901a-ed44b394d745 {{(pid=71283) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 10:45:13 user nova-compute[71283]: DEBUG nova.network.neutron [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Successfully updated port: e5f8609e-9e75-41d5-901a-ed44b394d745 {{(pid=71283) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 10:45:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquiring lock 
"refresh_cache-23e0e8b8-22fc-4ab2-8bff-35e203715439" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:45:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquired lock "refresh_cache-23e0e8b8-22fc-4ab2-8bff-35e203715439" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:45:13 user nova-compute[71283]: DEBUG nova.network.neutron [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Building network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 10:45:13 user nova-compute[71283]: DEBUG nova.compute.manager [req-fa0eb5a7-dee9-46de-8a81-30882b7ba985 req-589616ef-f422-4790-b404-c65ae1ce6151 service nova] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Received event network-changed-e5f8609e-9e75-41d5-901a-ed44b394d745 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:45:13 user nova-compute[71283]: DEBUG nova.compute.manager [req-fa0eb5a7-dee9-46de-8a81-30882b7ba985 req-589616ef-f422-4790-b404-c65ae1ce6151 service nova] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Refreshing instance network info cache due to event network-changed-e5f8609e-9e75-41d5-901a-ed44b394d745. 
{{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:45:13 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-fa0eb5a7-dee9-46de-8a81-30882b7ba985 req-589616ef-f422-4790-b404-c65ae1ce6151 service nova] Acquiring lock "refresh_cache-23e0e8b8-22fc-4ab2-8bff-35e203715439" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:45:13 user nova-compute[71283]: DEBUG nova.network.neutron [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Instance cache missing network info. {{(pid=71283) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 10:45:14 user nova-compute[71283]: DEBUG nova.network.neutron [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Updating instance_info_cache with network_info: [{"id": "e5f8609e-9e75-41d5-901a-ed44b394d745", "address": "fa:16:3e:83:0c:f6", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape5f8609e-9e", "ovs_interfaceid": 
"e5f8609e-9e75-41d5-901a-ed44b394d745", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:45:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Releasing lock "refresh_cache-23e0e8b8-22fc-4ab2-8bff-35e203715439" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:45:14 user nova-compute[71283]: DEBUG nova.compute.manager [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Instance network_info: |[{"id": "e5f8609e-9e75-41d5-901a-ed44b394d745", "address": "fa:16:3e:83:0c:f6", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape5f8609e-9e", "ovs_interfaceid": "e5f8609e-9e75-41d5-901a-ed44b394d745", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}]| {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 10:45:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-fa0eb5a7-dee9-46de-8a81-30882b7ba985 req-589616ef-f422-4790-b404-c65ae1ce6151 service nova] Acquired lock "refresh_cache-23e0e8b8-22fc-4ab2-8bff-35e203715439" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:45:14 user nova-compute[71283]: DEBUG nova.network.neutron [req-fa0eb5a7-dee9-46de-8a81-30882b7ba985 req-589616ef-f422-4790-b404-c65ae1ce6151 service nova] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Refreshing network info cache for port e5f8609e-9e75-41d5-901a-ed44b394d745 {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:45:14 user nova-compute[71283]: DEBUG nova.compute.claims [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Aborting claim: {{(pid=71283) abort /opt/stack/nova/nova/compute/claims.py:84}} Apr 20 10:45:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:45:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:45:14 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:45:14 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:45:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.197s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:45:14 user nova-compute[71283]: DEBUG nova.compute.manager [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f 
tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Build of instance 23e0e8b8-22fc-4ab2-8bff-35e203715439 aborted: Volume 3528c135-6b49-4596-ab3a-85ad549ebbbf did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. {{(pid=71283) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2636}} Apr 20 10:45:14 user nova-compute[71283]: DEBUG nova.compute.utils [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Build of instance 23e0e8b8-22fc-4ab2-8bff-35e203715439 aborted: Volume 3528c135-6b49-4596-ab3a-85ad549ebbbf did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. {{(pid=71283) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} Apr 20 10:45:14 user nova-compute[71283]: ERROR nova.compute.manager [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Build of instance 23e0e8b8-22fc-4ab2-8bff-35e203715439 aborted: Volume 3528c135-6b49-4596-ab3a-85ad549ebbbf did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error.: nova.exception.BuildAbortException: Build of instance 23e0e8b8-22fc-4ab2-8bff-35e203715439 aborted: Volume 3528c135-6b49-4596-ab3a-85ad549ebbbf did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. 
Apr 20 10:45:14 user nova-compute[71283]: DEBUG nova.compute.manager [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Unplugging VIFs for instance {{(pid=71283) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} Apr 20 10:45:14 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:45:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-642328482',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-642328482',id=22,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5bcb8e3930844068bd2b496914a4d764',ramdisk_id='',reservation_id='r-f0yysi91',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46
f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-134694710',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member'},tags=TagList,task_state='block_device_mapping',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:45:12Z,user_data=None,user_id='1b7d729afb5942b8a1753ffaa4d5a268',uuid=23e0e8b8-22fc-4ab2-8bff-35e203715439,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e5f8609e-9e75-41d5-901a-ed44b394d745", "address": "fa:16:3e:83:0c:f6", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape5f8609e-9e", "ovs_interfaceid": "e5f8609e-9e75-41d5-901a-ed44b394d745", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 10:45:14 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f 
tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Converting VIF {"id": "e5f8609e-9e75-41d5-901a-ed44b394d745", "address": "fa:16:3e:83:0c:f6", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape5f8609e-9e", "ovs_interfaceid": "e5f8609e-9e75-41d5-901a-ed44b394d745", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:45:14 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:0c:f6,bridge_name='br-int',has_traffic_filtering=True,id=e5f8609e-9e75-41d5-901a-ed44b394d745,network=Network(48558ac2-005d-42a1-a003-2662a04d6946),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5f8609e-9e') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:45:14 user nova-compute[71283]: DEBUG os_vif [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f 
tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:0c:f6,bridge_name='br-int',has_traffic_filtering=True,id=e5f8609e-9e75-41d5-901a-ed44b394d745,network=Network(48558ac2-005d-42a1-a003-2662a04d6946),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5f8609e-9e') {{(pid=71283) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 10:45:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:45:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape5f8609e-9e, bridge=br-int, if_exists=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:45:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 10:45:14 user nova-compute[71283]: INFO os_vif [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:0c:f6,bridge_name='br-int',has_traffic_filtering=True,id=e5f8609e-9e75-41d5-901a-ed44b394d745,network=Network(48558ac2-005d-42a1-a003-2662a04d6946),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5f8609e-9e') Apr 20 10:45:14 user nova-compute[71283]: DEBUG nova.compute.manager [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 
tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Unplugged VIFs for instance {{(pid=71283) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} Apr 20 10:45:14 user nova-compute[71283]: DEBUG nova.compute.manager [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Deallocating network for instance {{(pid=71283) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 10:45:14 user nova-compute[71283]: DEBUG nova.network.neutron [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] deallocate_for_instance() {{(pid=71283) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 10:45:14 user nova-compute[71283]: DEBUG nova.network.neutron [req-fa0eb5a7-dee9-46de-8a81-30882b7ba985 req-589616ef-f422-4790-b404-c65ae1ce6151 service nova] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Updated VIF entry in instance network info cache for port e5f8609e-9e75-41d5-901a-ed44b394d745. 
{{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:45:14 user nova-compute[71283]: DEBUG nova.network.neutron [req-fa0eb5a7-dee9-46de-8a81-30882b7ba985 req-589616ef-f422-4790-b404-c65ae1ce6151 service nova] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Updating instance_info_cache with network_info: [{"id": "e5f8609e-9e75-41d5-901a-ed44b394d745", "address": "fa:16:3e:83:0c:f6", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape5f8609e-9e", "ovs_interfaceid": "e5f8609e-9e75-41d5-901a-ed44b394d745", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:45:14 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-fa0eb5a7-dee9-46de-8a81-30882b7ba985 req-589616ef-f422-4790-b404-c65ae1ce6151 service nova] Releasing lock "refresh_cache-23e0e8b8-22fc-4ab2-8bff-35e203715439" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:45:15 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:45:15 
user nova-compute[71283]: DEBUG nova.network.neutron [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:45:15 user nova-compute[71283]: INFO nova.compute.manager [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 23e0e8b8-22fc-4ab2-8bff-35e203715439] Took 0.54 seconds to deallocate network for instance. Apr 20 10:45:15 user nova-compute[71283]: INFO nova.scheduler.client.report [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Deleted allocations for instance 23e0e8b8-22fc-4ab2-8bff-35e203715439 Apr 20 10:45:15 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-60d00bd6-937d-4e6f-8953-43915a6c8a6f tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "23e0e8b8-22fc-4ab2-8bff-35e203715439" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 3.382s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:45:17 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:45:22 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:45:22 user 
nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:45:24 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:45:27 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:45:27 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:45:27 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:45:27 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:45:28 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:45:28 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:45:28 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:45:28 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:45:28 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:45:28 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C 
qemu-img info /opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:45:28 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:45:28 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:45:29 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:45:29 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk 
--force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:45:29 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:45:29 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:45:29 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:45:29 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:45:29 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 20 10:45:29 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=9101MB free_disk=26.410053253173828GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": 
"0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view 
/opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 10:45:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:45:29 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:45:30 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 4277833d-7119-4772-ba6d-dfa7368a652b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:45:30 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance f950a703-4df1-432a-804c-a65807673d01 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:45:30 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:45:30 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:45:30 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Refreshing inventories for resource provider bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 20 10:45:30 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Updating ProviderTree inventory for provider bdbc83bd-9307-4e20-8e3d-430b77499399 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 20 10:45:30 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Updating inventory in ProviderTree for provider bdbc83bd-9307-4e20-8e3d-430b77499399 with 
inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 20 10:45:30 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Refreshing aggregate associations for resource provider bdbc83bd-9307-4e20-8e3d-430b77499399, aggregates: None {{(pid=71283) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 20 10:45:30 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Refreshing trait associations for resource provider bdbc83bd-9307-4e20-8e3d-430b77499399, traits: 
COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_EXTEND,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE {{(pid=71283) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 20 10:45:30 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:45:30 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 
'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:45:30 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:45:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.512s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:45:31 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:45:31 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:45:32 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:45:34 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:45:34 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:45:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "refresh_cache-f950a703-4df1-432a-804c-a65807673d01" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:45:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquired lock "refresh_cache-f950a703-4df1-432a-804c-a65807673d01" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:45:34 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: f950a703-4df1-432a-804c-a65807673d01] Forcefully refreshing network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 10:45:35 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: f950a703-4df1-432a-804c-a65807673d01] Updating instance_info_cache with network_info: [{"id": "441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a", "address": "fa:16:3e:3d:bd:3a", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": 
false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap441b9a7d-3a", "ovs_interfaceid": "441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:45:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Releasing lock "refresh_cache-f950a703-4df1-432a-804c-a65807673d01" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:45:35 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: f950a703-4df1-432a-804c-a65807673d01] Updated the network info_cache for instance {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 10:45:35 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:45:37 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:45:42 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:45:47 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:45:52 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:45:52 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:45:52 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:45:52 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:45:52 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:45:52 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:45:57 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:46:02 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:46:02 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-849bddc8-6a84-4e66-a395-d20c7ad7ba27 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquiring lock "f950a703-4df1-432a-804c-a65807673d01" by 
"nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:46:02 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-849bddc8-6a84-4e66-a395-d20c7ad7ba27 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "f950a703-4df1-432a-804c-a65807673d01" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:46:02 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-849bddc8-6a84-4e66-a395-d20c7ad7ba27 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquiring lock "f950a703-4df1-432a-804c-a65807673d01-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:46:02 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-849bddc8-6a84-4e66-a395-d20c7ad7ba27 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "f950a703-4df1-432a-804c-a65807673d01-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:46:02 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-849bddc8-6a84-4e66-a395-d20c7ad7ba27 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "f950a703-4df1-432a-804c-a65807673d01-events" "released" by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:46:02 user nova-compute[71283]: INFO nova.compute.manager [None req-849bddc8-6a84-4e66-a395-d20c7ad7ba27 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Terminating instance Apr 20 10:46:02 user nova-compute[71283]: DEBUG nova.compute.manager [None req-849bddc8-6a84-4e66-a395-d20c7ad7ba27 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Start destroying the instance on the hypervisor. {{(pid=71283) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 10:46:02 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:46:02 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:46:03 user nova-compute[71283]: DEBUG nova.compute.manager [req-5686bdfb-f26b-45d7-b336-4efe3509dc9d req-5e1d7e07-ffc3-41ca-8b44-4f10b5abe379 service nova] [instance: f950a703-4df1-432a-804c-a65807673d01] Received event network-vif-unplugged-441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:46:03 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-5686bdfb-f26b-45d7-b336-4efe3509dc9d req-5e1d7e07-ffc3-41ca-8b44-4f10b5abe379 service nova] Acquiring lock "f950a703-4df1-432a-804c-a65807673d01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:46:03 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-5686bdfb-f26b-45d7-b336-4efe3509dc9d req-5e1d7e07-ffc3-41ca-8b44-4f10b5abe379 service nova] Lock "f950a703-4df1-432a-804c-a65807673d01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:46:03 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-5686bdfb-f26b-45d7-b336-4efe3509dc9d req-5e1d7e07-ffc3-41ca-8b44-4f10b5abe379 service nova] Lock "f950a703-4df1-432a-804c-a65807673d01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:46:03 user nova-compute[71283]: DEBUG nova.compute.manager [req-5686bdfb-f26b-45d7-b336-4efe3509dc9d req-5e1d7e07-ffc3-41ca-8b44-4f10b5abe379 service nova] [instance: f950a703-4df1-432a-804c-a65807673d01] No waiting events found dispatching network-vif-unplugged-441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:46:03 user nova-compute[71283]: DEBUG nova.compute.manager [req-5686bdfb-f26b-45d7-b336-4efe3509dc9d req-5e1d7e07-ffc3-41ca-8b44-4f10b5abe379 service nova] [instance: f950a703-4df1-432a-804c-a65807673d01] Received event network-vif-unplugged-441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a for instance with task_state deleting. 
{{(pid=71283) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 10:46:03 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:46:03 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:46:03 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:46:03 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: f950a703-4df1-432a-804c-a65807673d01] Instance destroyed successfully. Apr 20 10:46:03 user nova-compute[71283]: DEBUG nova.objects.instance [None req-849bddc8-6a84-4e66-a395-d20c7ad7ba27 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lazy-loading 'resources' on Instance uuid f950a703-4df1-432a-804c-a65807673d01 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:46:03 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-849bddc8-6a84-4e66-a395-d20c7ad7ba27 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:42:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1001420332',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1001420332',id=21,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-20T10:42:19Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='5bcb8e3930844068bd2b496914a4d764',ramdisk_id='',reservation_id='r-whoq06s6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-134694710',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member'},tags=,task_state='deleting',terminat
ed_at=None,trusted_certs=,updated_at=2023-04-20T10:44:10Z,user_data=None,user_id='1b7d729afb5942b8a1753ffaa4d5a268',uuid=f950a703-4df1-432a-804c-a65807673d01,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a", "address": "fa:16:3e:3d:bd:3a", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap441b9a7d-3a", "ovs_interfaceid": "441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 10:46:03 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-849bddc8-6a84-4e66-a395-d20c7ad7ba27 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Converting VIF {"id": "441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a", "address": "fa:16:3e:3d:bd:3a", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap441b9a7d-3a", "ovs_interfaceid": "441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:46:03 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-849bddc8-6a84-4e66-a395-d20c7ad7ba27 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3d:bd:3a,bridge_name='br-int',has_traffic_filtering=True,id=441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a,network=Network(48558ac2-005d-42a1-a003-2662a04d6946),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap441b9a7d-3a') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:46:03 user nova-compute[71283]: DEBUG os_vif [None req-849bddc8-6a84-4e66-a395-d20c7ad7ba27 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3d:bd:3a,bridge_name='br-int',has_traffic_filtering=True,id=441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a,network=Network(48558ac2-005d-42a1-a003-2662a04d6946),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap441b9a7d-3a') {{(pid=71283) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 10:46:03 user nova-compute[71283]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:46:03 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap441b9a7d-3a, bridge=br-int, if_exists=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:46:03 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:46:03 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:46:03 user nova-compute[71283]: INFO os_vif [None req-849bddc8-6a84-4e66-a395-d20c7ad7ba27 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3d:bd:3a,bridge_name='br-int',has_traffic_filtering=True,id=441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a,network=Network(48558ac2-005d-42a1-a003-2662a04d6946),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap441b9a7d-3a') Apr 20 10:46:03 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-849bddc8-6a84-4e66-a395-d20c7ad7ba27 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Deleting instance files /opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01_del Apr 20 10:46:03 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-849bddc8-6a84-4e66-a395-d20c7ad7ba27 tempest-ServerBootFromVolumeStableRescueTest-134694710 
tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Deletion of /opt/stack/data/nova/instances/f950a703-4df1-432a-804c-a65807673d01_del complete Apr 20 10:46:03 user nova-compute[71283]: INFO nova.compute.manager [None req-849bddc8-6a84-4e66-a395-d20c7ad7ba27 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: f950a703-4df1-432a-804c-a65807673d01] Took 0.67 seconds to destroy the instance on the hypervisor. Apr 20 10:46:03 user nova-compute[71283]: DEBUG oslo.service.loopingcall [None req-849bddc8-6a84-4e66-a395-d20c7ad7ba27 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71283) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 10:46:03 user nova-compute[71283]: DEBUG nova.compute.manager [-] [instance: f950a703-4df1-432a-804c-a65807673d01] Deallocating network for instance {{(pid=71283) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 10:46:03 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: f950a703-4df1-432a-804c-a65807673d01] deallocate_for_instance() {{(pid=71283) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 10:46:03 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: f950a703-4df1-432a-804c-a65807673d01] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:46:03 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: f950a703-4df1-432a-804c-a65807673d01] Took 0.44 seconds to deallocate network for instance. 
Apr 20 10:46:03 user nova-compute[71283]: DEBUG nova.compute.manager [req-04281827-5dcb-4510-bcb6-4c5d4ed11a24 req-ecc6ed3d-4998-40e1-b9a7-453ae528c9cf service nova] [instance: f950a703-4df1-432a-804c-a65807673d01] Received event network-vif-deleted-441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:46:03 user nova-compute[71283]: INFO nova.compute.manager [req-04281827-5dcb-4510-bcb6-4c5d4ed11a24 req-ecc6ed3d-4998-40e1-b9a7-453ae528c9cf service nova] [instance: f950a703-4df1-432a-804c-a65807673d01] Neutron deleted interface 441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a; detaching it from the instance and deleting it from the info cache Apr 20 10:46:03 user nova-compute[71283]: DEBUG nova.network.neutron [req-04281827-5dcb-4510-bcb6-4c5d4ed11a24 req-ecc6ed3d-4998-40e1-b9a7-453ae528c9cf service nova] [instance: f950a703-4df1-432a-804c-a65807673d01] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:46:03 user nova-compute[71283]: DEBUG nova.compute.manager [req-04281827-5dcb-4510-bcb6-4c5d4ed11a24 req-ecc6ed3d-4998-40e1-b9a7-453ae528c9cf service nova] [instance: f950a703-4df1-432a-804c-a65807673d01] Detach interface failed, port_id=441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a, reason: Instance f950a703-4df1-432a-804c-a65807673d01 could not be found. 
{{(pid=71283) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 20 10:46:03 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-849bddc8-6a84-4e66-a395-d20c7ad7ba27 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:46:03 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-849bddc8-6a84-4e66-a395-d20c7ad7ba27 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:46:04 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-849bddc8-6a84-4e66-a395-d20c7ad7ba27 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:46:04 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-849bddc8-6a84-4e66-a395-d20c7ad7ba27 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:46:04 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-849bddc8-6a84-4e66-a395-d20c7ad7ba27 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.136s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:46:04 user nova-compute[71283]: INFO nova.scheduler.client.report [None req-849bddc8-6a84-4e66-a395-d20c7ad7ba27 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Deleted allocations for instance f950a703-4df1-432a-804c-a65807673d01 Apr 20 10:46:04 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-849bddc8-6a84-4e66-a395-d20c7ad7ba27 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "f950a703-4df1-432a-804c-a65807673d01" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.416s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:46:05 user nova-compute[71283]: DEBUG nova.compute.manager [req-ef8b083b-46a2-4979-b387-6e4582f1f63c req-03433176-a351-4e9c-ba73-fd84e059406e service nova] [instance: f950a703-4df1-432a-804c-a65807673d01] Received event network-vif-plugged-441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:46:05 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-ef8b083b-46a2-4979-b387-6e4582f1f63c 
req-03433176-a351-4e9c-ba73-fd84e059406e service nova] Acquiring lock "f950a703-4df1-432a-804c-a65807673d01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:46:05 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-ef8b083b-46a2-4979-b387-6e4582f1f63c req-03433176-a351-4e9c-ba73-fd84e059406e service nova] Lock "f950a703-4df1-432a-804c-a65807673d01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:46:05 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-ef8b083b-46a2-4979-b387-6e4582f1f63c req-03433176-a351-4e9c-ba73-fd84e059406e service nova] Lock "f950a703-4df1-432a-804c-a65807673d01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:46:05 user nova-compute[71283]: DEBUG nova.compute.manager [req-ef8b083b-46a2-4979-b387-6e4582f1f63c req-03433176-a351-4e9c-ba73-fd84e059406e service nova] [instance: f950a703-4df1-432a-804c-a65807673d01] No waiting events found dispatching network-vif-plugged-441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:46:05 user nova-compute[71283]: WARNING nova.compute.manager [req-ef8b083b-46a2-4979-b387-6e4582f1f63c req-03433176-a351-4e9c-ba73-fd84e059406e service nova] [instance: f950a703-4df1-432a-804c-a65807673d01] Received unexpected event network-vif-plugged-441b9a7d-3a3d-486b-9a0e-6cb55fdcbe4a for instance with vm_state deleted and task_state None. 
Apr 20 10:46:08 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:46:13 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:46:18 user nova-compute[71283]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:46:18 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: f950a703-4df1-432a-804c-a65807673d01] VM Stopped (Lifecycle Event) Apr 20 10:46:18 user nova-compute[71283]: DEBUG nova.compute.manager [None req-b8eb679d-bf7e-4e8b-9690-7d8fd2da5c74 None None] [instance: f950a703-4df1-432a-804c-a65807673d01] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:46:18 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:46:23 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:46:24 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:46:24 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:46:27 user nova-compute[71283]: DEBUG 
oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:46:28 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:46:28 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:46:28 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:46:29 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:46:29 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:46:30 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:46:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:46:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:46:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:46:30 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:46:30 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:46:30 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:46:30 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:46:31 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:46:31 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:46:31 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:46:31 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=9147MB free_disk=26.46477508544922GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 
20 10:46:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:46:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:46:31 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 4277833d-7119-4772-ba6d-dfa7368a652b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:46:31 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:46:31 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:46:31 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:46:31 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:46:31 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:46:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.188s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:46:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:46:35 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:46:35 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:46:35 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Rebuilding the list of instances to heal {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 10:46:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "refresh_cache-4277833d-7119-4772-ba6d-dfa7368a652b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:46:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquired lock "refresh_cache-4277833d-7119-4772-ba6d-dfa7368a652b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} 
Apr 20 10:46:35 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Forcefully refreshing network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 10:46:35 user nova-compute[71283]: DEBUG nova.objects.instance [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lazy-loading 'info_cache' on Instance uuid 4277833d-7119-4772-ba6d-dfa7368a652b {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:46:36 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Updating instance_info_cache with network_info: [{"id": "02bf30f9-074c-4a50-a4c8-d468c31067a8", "address": "fa:16:3e:ee:07:da", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap02bf30f9-07", "ovs_interfaceid": "02bf30f9-074c-4a50-a4c8-d468c31067a8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:46:36 user nova-compute[71283]: DEBUG 
oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Releasing lock "refresh_cache-4277833d-7119-4772-ba6d-dfa7368a652b" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:46:36 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Updated the network info_cache for instance {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 10:46:36 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:46:37 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:46:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:46:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:46:48 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:46:53 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:46:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None 
req-2e89cf77-fa34-4c89-9364-7d9f7c7ef9d7 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquiring lock "4277833d-7119-4772-ba6d-dfa7368a652b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:46:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2e89cf77-fa34-4c89-9364-7d9f7c7ef9d7 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "4277833d-7119-4772-ba6d-dfa7368a652b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:46:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2e89cf77-fa34-4c89-9364-7d9f7c7ef9d7 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquiring lock "4277833d-7119-4772-ba6d-dfa7368a652b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:46:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2e89cf77-fa34-4c89-9364-7d9f7c7ef9d7 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "4277833d-7119-4772-ba6d-dfa7368a652b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:46:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None 
req-2e89cf77-fa34-4c89-9364-7d9f7c7ef9d7 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "4277833d-7119-4772-ba6d-dfa7368a652b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:46:53 user nova-compute[71283]: INFO nova.compute.manager [None req-2e89cf77-fa34-4c89-9364-7d9f7c7ef9d7 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Terminating instance Apr 20 10:46:53 user nova-compute[71283]: DEBUG nova.compute.manager [None req-2e89cf77-fa34-4c89-9364-7d9f7c7ef9d7 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Start destroying the instance on the hypervisor. 
{{(pid=71283) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 10:46:53 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:46:53 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:46:53 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:46:53 user nova-compute[71283]: DEBUG nova.compute.manager [req-f95039a3-2b01-4df6-bf66-35ce1bd22e0b req-4a5cd2ed-1ea6-45b6-aaf0-a847e31cf284 service nova] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Received event network-vif-unplugged-02bf30f9-074c-4a50-a4c8-d468c31067a8 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:46:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-f95039a3-2b01-4df6-bf66-35ce1bd22e0b req-4a5cd2ed-1ea6-45b6-aaf0-a847e31cf284 service nova] Acquiring lock "4277833d-7119-4772-ba6d-dfa7368a652b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:46:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-f95039a3-2b01-4df6-bf66-35ce1bd22e0b req-4a5cd2ed-1ea6-45b6-aaf0-a847e31cf284 service nova] Lock "4277833d-7119-4772-ba6d-dfa7368a652b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:46:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-f95039a3-2b01-4df6-bf66-35ce1bd22e0b req-4a5cd2ed-1ea6-45b6-aaf0-a847e31cf284 service 
nova] Lock "4277833d-7119-4772-ba6d-dfa7368a652b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:46:53 user nova-compute[71283]: DEBUG nova.compute.manager [req-f95039a3-2b01-4df6-bf66-35ce1bd22e0b req-4a5cd2ed-1ea6-45b6-aaf0-a847e31cf284 service nova] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] No waiting events found dispatching network-vif-unplugged-02bf30f9-074c-4a50-a4c8-d468c31067a8 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:46:53 user nova-compute[71283]: DEBUG nova.compute.manager [req-f95039a3-2b01-4df6-bf66-35ce1bd22e0b req-4a5cd2ed-1ea6-45b6-aaf0-a847e31cf284 service nova] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Received event network-vif-unplugged-02bf30f9-074c-4a50-a4c8-d468c31067a8 for instance with task_state deleting. {{(pid=71283) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 10:46:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:46:54 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Instance destroyed successfully. 
Apr 20 10:46:54 user nova-compute[71283]: DEBUG nova.objects.instance [None req-2e89cf77-fa34-4c89-9364-7d9f7c7ef9d7 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lazy-loading 'resources' on Instance uuid 4277833d-7119-4772-ba6d-dfa7368a652b {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:46:54 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-2e89cf77-fa34-4c89-9364-7d9f7c7ef9d7 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:38:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-307242091',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-307242091',id=17,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2023-04-20T10:38:30Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5bcb8e3930844068bd2b496914a4d764',ramdisk_id='',reservation_id='r-khwltncp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-134694710',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2023-04-20T10:40:20Z,user_data=None,user_id='1b7d729afb5942b8a1753ffaa4d5a268',uuid=4277833d-7119-4772-ba6d-dfa7368a652b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "02bf30f9-074c-4a50-a4c8-d468c31067a8", "address": "fa:16:3e:ee:07:da", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap02bf30f9-07", "ovs_interfaceid": "02bf30f9-074c-4a50-a4c8-d468c31067a8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 10:46:54 user nova-compute[71283]: DEBUG 
nova.network.os_vif_util [None req-2e89cf77-fa34-4c89-9364-7d9f7c7ef9d7 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Converting VIF {"id": "02bf30f9-074c-4a50-a4c8-d468c31067a8", "address": "fa:16:3e:ee:07:da", "network": {"id": "48558ac2-005d-42a1-a003-2662a04d6946", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-838813002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5bcb8e3930844068bd2b496914a4d764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap02bf30f9-07", "ovs_interfaceid": "02bf30f9-074c-4a50-a4c8-d468c31067a8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:46:54 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-2e89cf77-fa34-4c89-9364-7d9f7c7ef9d7 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ee:07:da,bridge_name='br-int',has_traffic_filtering=True,id=02bf30f9-074c-4a50-a4c8-d468c31067a8,network=Network(48558ac2-005d-42a1-a003-2662a04d6946),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02bf30f9-07') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:46:54 user nova-compute[71283]: DEBUG 
os_vif [None req-2e89cf77-fa34-4c89-9364-7d9f7c7ef9d7 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:07:da,bridge_name='br-int',has_traffic_filtering=True,id=02bf30f9-074c-4a50-a4c8-d468c31067a8,network=Network(48558ac2-005d-42a1-a003-2662a04d6946),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02bf30f9-07') {{(pid=71283) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 10:46:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:46:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02bf30f9-07, bridge=br-int, if_exists=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:46:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:46:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:46:54 user nova-compute[71283]: INFO os_vif [None req-2e89cf77-fa34-4c89-9364-7d9f7c7ef9d7 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:07:da,bridge_name='br-int',has_traffic_filtering=True,id=02bf30f9-074c-4a50-a4c8-d468c31067a8,network=Network(48558ac2-005d-42a1-a003-2662a04d6946),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02bf30f9-07') 
Apr 20 10:46:54 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-2e89cf77-fa34-4c89-9364-7d9f7c7ef9d7 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Deleting instance files /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b_del Apr 20 10:46:54 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-2e89cf77-fa34-4c89-9364-7d9f7c7ef9d7 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Deletion of /opt/stack/data/nova/instances/4277833d-7119-4772-ba6d-dfa7368a652b_del complete Apr 20 10:46:54 user nova-compute[71283]: INFO nova.compute.manager [None req-2e89cf77-fa34-4c89-9364-7d9f7c7ef9d7 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Took 0.87 seconds to destroy the instance on the hypervisor. Apr 20 10:46:54 user nova-compute[71283]: DEBUG oslo.service.loopingcall [None req-2e89cf77-fa34-4c89-9364-7d9f7c7ef9d7 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=71283) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 10:46:54 user nova-compute[71283]: DEBUG nova.compute.manager [-] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Deallocating network for instance {{(pid=71283) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 10:46:54 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] deallocate_for_instance() {{(pid=71283) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 10:46:54 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:46:54 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Took 0.48 seconds to deallocate network for instance. 
Apr 20 10:46:54 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2e89cf77-fa34-4c89-9364-7d9f7c7ef9d7 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:46:54 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2e89cf77-fa34-4c89-9364-7d9f7c7ef9d7 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:46:54 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-2e89cf77-fa34-4c89-9364-7d9f7c7ef9d7 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:46:54 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-2e89cf77-fa34-4c89-9364-7d9f7c7ef9d7 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 
'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:46:55 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2e89cf77-fa34-4c89-9364-7d9f7c7ef9d7 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.133s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:46:55 user nova-compute[71283]: INFO nova.scheduler.client.report [None req-2e89cf77-fa34-4c89-9364-7d9f7c7ef9d7 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Deleted allocations for instance 4277833d-7119-4772-ba6d-dfa7368a652b Apr 20 10:46:55 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2e89cf77-fa34-4c89-9364-7d9f7c7ef9d7 tempest-ServerBootFromVolumeStableRescueTest-134694710 tempest-ServerBootFromVolumeStableRescueTest-134694710-project-member] Lock "4277833d-7119-4772-ba6d-dfa7368a652b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.671s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:46:55 user nova-compute[71283]: DEBUG nova.compute.manager [req-4b669bbc-2cb5-4d7b-86e9-ac95fa26b9ad req-6c892d7b-6ffa-4cf0-b8ba-396f51d48b98 service nova] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Received event network-vif-plugged-02bf30f9-074c-4a50-a4c8-d468c31067a8 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:46:55 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-4b669bbc-2cb5-4d7b-86e9-ac95fa26b9ad req-6c892d7b-6ffa-4cf0-b8ba-396f51d48b98 service nova] Acquiring lock 
"4277833d-7119-4772-ba6d-dfa7368a652b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:46:55 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-4b669bbc-2cb5-4d7b-86e9-ac95fa26b9ad req-6c892d7b-6ffa-4cf0-b8ba-396f51d48b98 service nova] Lock "4277833d-7119-4772-ba6d-dfa7368a652b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:46:55 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-4b669bbc-2cb5-4d7b-86e9-ac95fa26b9ad req-6c892d7b-6ffa-4cf0-b8ba-396f51d48b98 service nova] Lock "4277833d-7119-4772-ba6d-dfa7368a652b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:46:55 user nova-compute[71283]: DEBUG nova.compute.manager [req-4b669bbc-2cb5-4d7b-86e9-ac95fa26b9ad req-6c892d7b-6ffa-4cf0-b8ba-396f51d48b98 service nova] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] No waiting events found dispatching network-vif-plugged-02bf30f9-074c-4a50-a4c8-d468c31067a8 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:46:55 user nova-compute[71283]: WARNING nova.compute.manager [req-4b669bbc-2cb5-4d7b-86e9-ac95fa26b9ad req-6c892d7b-6ffa-4cf0-b8ba-396f51d48b98 service nova] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Received unexpected event network-vif-plugged-02bf30f9-074c-4a50-a4c8-d468c31067a8 for instance with vm_state deleted and task_state None. 
Apr 20 10:46:55 user nova-compute[71283]: DEBUG nova.compute.manager [req-4b669bbc-2cb5-4d7b-86e9-ac95fa26b9ad req-6c892d7b-6ffa-4cf0-b8ba-396f51d48b98 service nova] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Received event network-vif-deleted-02bf30f9-074c-4a50-a4c8-d468c31067a8 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:46:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:47:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:47:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:47:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:47:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:47:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:47:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:47:09 user nova-compute[71283]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:47:09 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: 
4277833d-7119-4772-ba6d-dfa7368a652b] VM Stopped (Lifecycle Event) Apr 20 10:47:09 user nova-compute[71283]: DEBUG nova.compute.manager [None req-1a817b8c-c485-4d12-bda4-e4d8ade3f49f None None] [instance: 4277833d-7119-4772-ba6d-dfa7368a652b] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:47:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:47:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:47:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:47:23 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:47:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:47:24 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:47:27 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 
10:47:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:47:29 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:47:30 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:47:30 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:47:30 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:47:30 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:47:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:47:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:47:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:47:30 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:47:31 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:47:31 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:47:31 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=9264MB free_disk=26.519744873046875GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, 
"label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": 
"0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": 
"0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 10:47:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:47:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:47:31 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:47:31 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:47:31 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:47:31 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:47:31 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:47:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.148s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:47:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:47:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:47:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:47:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition 
/usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:47:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:47:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:47:35 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:47:35 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:47:35 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Rebuilding the list of instances to heal {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 10:47:35 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Didn't find any instances for network info cache update. 
{{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 20 10:47:36 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:47:39 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:47:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:47:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:47:46 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:47:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:47:52 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:47:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:47:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:47:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:47:56 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Acquiring lock "6a0d597a-4eea-4938-92c6-c17d0a6c82e5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:47:56 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "6a0d597a-4eea-4938-92c6-c17d0a6c82e5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:47:56 user nova-compute[71283]: DEBUG nova.compute.manager [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Starting instance... 
{{(pid=71283) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 10:47:56 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:47:56 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:47:56 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71283) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 10:47:56 user nova-compute[71283]: INFO nova.compute.claims [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Claim successful on node user Apr 20 10:47:56 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:47:56 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:47:56 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.211s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:47:56 user 
nova-compute[71283]: DEBUG nova.compute.manager [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Start building networks asynchronously for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 10:47:56 user nova-compute[71283]: DEBUG nova.compute.manager [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Allocating IP information in the background. {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 10:47:56 user nova-compute[71283]: DEBUG nova.network.neutron [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] allocate_for_instance() {{(pid=71283) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 10:47:56 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 10:47:56 user nova-compute[71283]: DEBUG nova.compute.manager [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Start building block device mappings for instance. 
{{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 10:47:56 user nova-compute[71283]: DEBUG nova.policy [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8b965155fa8a453f9f3ebbf5514bb96f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '938a2ac3aa514d5ea63c1598221790f8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71283) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 10:47:56 user nova-compute[71283]: DEBUG nova.compute.manager [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Start spawning the instance on the hypervisor. 
{{(pid=71283) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 10:47:56 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Creating instance directory {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 10:47:56 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Creating image(s) Apr 20 10:47:56 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Acquiring lock "/opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:47:56 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "/opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:47:56 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock 
"/opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:47:56 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:47:56 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.130s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:47:56 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Acquiring lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:47:56 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None 
req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:47:56 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:47:57 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.132s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:47:57 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw 
/opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk 1073741824 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:47:57 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk 1073741824" returned: 0 in 0.046s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:47:57 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.185s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:47:57 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:47:57 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 
tempest-ServersNegativeTestJSON-284593456-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.127s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:47:57 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Checking if we can resize image /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk. size=1073741824 {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 10:47:57 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:47:57 user nova-compute[71283]: DEBUG nova.network.neutron [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Successfully created port: 5f63e57e-560d-426d-87da-b178c30bbb27 {{(pid=71283) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 10:47:57 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 
tempest-ServersNegativeTestJSON-284593456-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:47:57 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Cannot resize image /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk to a smaller size. {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 10:47:57 user nova-compute[71283]: DEBUG nova.objects.instance [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lazy-loading 'migration_context' on Instance uuid 6a0d597a-4eea-4938-92c6-c17d0a6c82e5 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:47:57 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Created local disks {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 10:47:57 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Ensure instance console log exists: /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/console.log {{(pid=71283) _ensure_console_log_for_instance 
/opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 10:47:57 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:47:57 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:47:57 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:47:57 user nova-compute[71283]: DEBUG nova.network.neutron [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Successfully updated port: 5f63e57e-560d-426d-87da-b178c30bbb27 {{(pid=71283) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 10:47:57 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Acquiring lock 
"refresh_cache-6a0d597a-4eea-4938-92c6-c17d0a6c82e5" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:47:57 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Acquired lock "refresh_cache-6a0d597a-4eea-4938-92c6-c17d0a6c82e5" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:47:57 user nova-compute[71283]: DEBUG nova.network.neutron [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Building network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 10:47:57 user nova-compute[71283]: DEBUG nova.compute.manager [req-9c7d532f-4af0-4438-b2f8-4a5fdbbcd4b9 req-90bd56c9-ab0d-4cf1-936b-da1cf9040386 service nova] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Received event network-changed-5f63e57e-560d-426d-87da-b178c30bbb27 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:47:57 user nova-compute[71283]: DEBUG nova.compute.manager [req-9c7d532f-4af0-4438-b2f8-4a5fdbbcd4b9 req-90bd56c9-ab0d-4cf1-936b-da1cf9040386 service nova] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Refreshing instance network info cache due to event network-changed-5f63e57e-560d-426d-87da-b178c30bbb27. 
{{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:47:57 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-9c7d532f-4af0-4438-b2f8-4a5fdbbcd4b9 req-90bd56c9-ab0d-4cf1-936b-da1cf9040386 service nova] Acquiring lock "refresh_cache-6a0d597a-4eea-4938-92c6-c17d0a6c82e5" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:47:57 user nova-compute[71283]: DEBUG nova.network.neutron [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Instance cache missing network info. {{(pid=71283) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG nova.network.neutron [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Updating instance_info_cache with network_info: [{"id": "5f63e57e-560d-426d-87da-b178c30bbb27", "address": "fa:16:3e:2b:1c:e8", "network": {"id": "94565cf9-f25f-4733-96b4-8ec0ecfda684", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1521378163-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "938a2ac3aa514d5ea63c1598221790f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f63e57e-56", "ovs_interfaceid": "5f63e57e-560d-426d-87da-b178c30bbb27", "qbh_params": null, "qbg_params": 
null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Releasing lock "refresh_cache-6a0d597a-4eea-4938-92c6-c17d0a6c82e5" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG nova.compute.manager [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Instance network_info: |[{"id": "5f63e57e-560d-426d-87da-b178c30bbb27", "address": "fa:16:3e:2b:1c:e8", "network": {"id": "94565cf9-f25f-4733-96b4-8ec0ecfda684", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1521378163-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "938a2ac3aa514d5ea63c1598221790f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f63e57e-56", "ovs_interfaceid": "5f63e57e-560d-426d-87da-b178c30bbb27", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 10:47:58 
user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-9c7d532f-4af0-4438-b2f8-4a5fdbbcd4b9 req-90bd56c9-ab0d-4cf1-936b-da1cf9040386 service nova] Acquired lock "refresh_cache-6a0d597a-4eea-4938-92c6-c17d0a6c82e5" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG nova.network.neutron [req-9c7d532f-4af0-4438-b2f8-4a5fdbbcd4b9 req-90bd56c9-ab0d-4cf1-936b-da1cf9040386 service nova] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Refreshing network info cache for port 5f63e57e-560d-426d-87da-b178c30bbb27 {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Start _get_guest_xml network_info=[{"id": "5f63e57e-560d-426d-87da-b178c30bbb27", "address": "fa:16:3e:2b:1c:e8", "network": {"id": "94565cf9-f25f-4733-96b4-8ec0ecfda684", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1521378163-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "938a2ac3aa514d5ea63c1598221790f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f63e57e-56", "ovs_interfaceid": "5f63e57e-560d-426d-87da-b178c30bbb27", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] 
disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '3687b278-c2bc-46f2-9b7e-579c8d06fe41'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 10:47:58 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:47:58 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
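[Editor's note] The qemu-img invocations logged above show how this instance's root disk is built: Nova probes the cached base image, then creates a qcow2 overlay on top of it. A minimal sketch of that command construction follows; the helper name `qcow2_overlay_cmd` and the argv layout are illustrative assumptions, not Nova's actual code, but the resulting string matches the CMD line logged at 10:47:57.

```python
# Illustrative sketch (not Nova's implementation) of building the qemu-img
# command that creates the instance's copy-on-write root disk: a qcow2
# overlay whose backing file is the shared raw base image in _base/.

def qcow2_overlay_cmd(base_path: str, disk_path: str, size_bytes: int) -> list:
    """Build an argv like the one oslo.concurrency logs as 'Running cmd'."""
    return [
        "env", "LC_ALL=C", "LANG=C",       # force a stable locale for parsing
        "qemu-img", "create",
        "-f", "qcow2",
        # Record both the backing file and its format, so qemu never has to
        # probe the raw base image (probing raw files is a known risk).
        "-o", "backing_file={},backing_fmt=raw".format(base_path),
        disk_path,
        str(size_bytes),                   # overlay virtual size in bytes
    ]

cmd = qcow2_overlay_cmd(
    "/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113",
    "/opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk",
    1073741824,  # 1 GiB root disk from the m1.nano flavor (root_gb=1)
)
print(" ".join(cmd))
```

Note that only the small overlay is per-instance; the 1 GiB size is the virtual size, and the subsequent "Cannot resize image ... to a smaller size" check simply refuses to shrink the disk below the image's virtual size.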
Apr 20 10:47:58 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71283) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T10:25:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=), allow threads: True {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Flavor limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] 
Image limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Flavor pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Image pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG nova.virt.hardware [None 
req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Got 1 possible topologies {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:47:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-504126875',display_name='tempest-ServersNegativeTestJSON-server-504126875',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-504126875',id=23,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='938a2ac3aa514d5ea63c1598221790f8',ramdisk_id='',reservation_id='r-y9va7a3q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-284593456',owner_user_name='tempest-ServersNegativeTestJSON-284593456-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:47:57Z,user_data=None,user_id='8b965155fa8a453f9f3ebbf5
514bb96f',uuid=6a0d597a-4eea-4938-92c6-c17d0a6c82e5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5f63e57e-560d-426d-87da-b178c30bbb27", "address": "fa:16:3e:2b:1c:e8", "network": {"id": "94565cf9-f25f-4733-96b4-8ec0ecfda684", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1521378163-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "938a2ac3aa514d5ea63c1598221790f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f63e57e-56", "ovs_interfaceid": "5f63e57e-560d-426d-87da-b178c30bbb27", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71283) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Converting VIF {"id": "5f63e57e-560d-426d-87da-b178c30bbb27", "address": "fa:16:3e:2b:1c:e8", "network": {"id": "94565cf9-f25f-4733-96b4-8ec0ecfda684", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1521378163-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": 
"938a2ac3aa514d5ea63c1598221790f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f63e57e-56", "ovs_interfaceid": "5f63e57e-560d-426d-87da-b178c30bbb27", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:1c:e8,bridge_name='br-int',has_traffic_filtering=True,id=5f63e57e-560d-426d-87da-b178c30bbb27,network=Network(94565cf9-f25f-4733-96b4-8ec0ecfda684),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f63e57e-56') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG nova.objects.instance [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lazy-loading 'pci_devices' on Instance uuid 6a0d597a-4eea-4938-92c6-c17d0a6c82e5 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] End _get_guest_xml xml=
[libvirt guest domain XML elided: the XML markup was stripped during capture, leaving only bare text values. Recoverable fields from the residue: name instance-00000017, uuid 6a0d597a-4eea-4938-92c6-c17d0a6c82e5, nova display name tempest-ServersNegativeTestJSON-server-504126875, creation time 2023-04-20 10:47:58, memory 131072 KiB (128 MB), 1 vCPU, os type hvm, CPU model Nehalem, RNG backend /dev/urandom, sysinfo vendor OpenStack Foundation, product OpenStack Nova, version 0.0.0]
{{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322
tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:47:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-504126875',display_name='tempest-ServersNegativeTestJSON-server-504126875',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-504126875',id=23,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='938a2ac3aa514d5ea63c1598221790f8',ramdisk_id='',reservation_id='r-y9va7a3q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-284593456',owner_user_name='tempest-ServersNegativeTestJSON-284593456-project-member'},tags=TagList,task_state='sp
awning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:47:57Z,user_data=None,user_id='8b965155fa8a453f9f3ebbf5514bb96f',uuid=6a0d597a-4eea-4938-92c6-c17d0a6c82e5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5f63e57e-560d-426d-87da-b178c30bbb27", "address": "fa:16:3e:2b:1c:e8", "network": {"id": "94565cf9-f25f-4733-96b4-8ec0ecfda684", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1521378163-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "938a2ac3aa514d5ea63c1598221790f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f63e57e-56", "ovs_interfaceid": "5f63e57e-560d-426d-87da-b178c30bbb27", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Converting VIF {"id": "5f63e57e-560d-426d-87da-b178c30bbb27", "address": "fa:16:3e:2b:1c:e8", "network": {"id": "94565cf9-f25f-4733-96b4-8ec0ecfda684", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1521378163-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "938a2ac3aa514d5ea63c1598221790f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f63e57e-56", "ovs_interfaceid": "5f63e57e-560d-426d-87da-b178c30bbb27", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:1c:e8,bridge_name='br-int',has_traffic_filtering=True,id=5f63e57e-560d-426d-87da-b178c30bbb27,network=Network(94565cf9-f25f-4733-96b4-8ec0ecfda684),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f63e57e-56') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG os_vif [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:1c:e8,bridge_name='br-int',has_traffic_filtering=True,id=5f63e57e-560d-426d-87da-b178c30bbb27,network=Network(94565cf9-f25f-4733-96b4-8ec0ecfda684),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f63e57e-56') {{(pid=71283) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5f63e57e-56, may_exist=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5f63e57e-56, col_values=(('external_ids', {'iface-id': '5f63e57e-560d-426d-87da-b178c30bbb27', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2b:1c:e8', 'vm-uuid': '6a0d597a-4eea-4938-92c6-c17d0a6c82e5'}),)) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:47:58 user 
nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:47:58 user nova-compute[71283]: INFO os_vif [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:1c:e8,bridge_name='br-int',has_traffic_filtering=True,id=5f63e57e-560d-426d-87da-b178c30bbb27,network=Network(94565cf9-f25f-4733-96b4-8ec0ecfda684),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f63e57e-56') Apr 20 10:47:58 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] No BDM found with device name vda, not building metadata. {{(pid=71283) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] No VIF found with MAC fa:16:3e:2b:1c:e8, not building metadata {{(pid=71283) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG nova.network.neutron [req-9c7d532f-4af0-4438-b2f8-4a5fdbbcd4b9 req-90bd56c9-ab0d-4cf1-936b-da1cf9040386 service nova] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Updated VIF entry in instance network info cache for port 5f63e57e-560d-426d-87da-b178c30bbb27. 
{{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG nova.network.neutron [req-9c7d532f-4af0-4438-b2f8-4a5fdbbcd4b9 req-90bd56c9-ab0d-4cf1-936b-da1cf9040386 service nova] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Updating instance_info_cache with network_info: [{"id": "5f63e57e-560d-426d-87da-b178c30bbb27", "address": "fa:16:3e:2b:1c:e8", "network": {"id": "94565cf9-f25f-4733-96b4-8ec0ecfda684", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1521378163-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "938a2ac3aa514d5ea63c1598221790f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f63e57e-56", "ovs_interfaceid": "5f63e57e-560d-426d-87da-b178c30bbb27", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:47:58 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-9c7d532f-4af0-4438-b2f8-4a5fdbbcd4b9 req-90bd56c9-ab0d-4cf1-936b-da1cf9040386 service nova] Releasing lock "refresh_cache-6a0d597a-4eea-4938-92c6-c17d0a6c82e5" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:47:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:47:59 user 
nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:47:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:47:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:48:00 user nova-compute[71283]: DEBUG nova.compute.manager [req-9d61dc3b-7e88-4c54-b050-bfb9f8c1c676 req-e437fee8-a55b-4294-8695-f2faa63b288b service nova] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Received event network-vif-plugged-5f63e57e-560d-426d-87da-b178c30bbb27 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:48:00 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-9d61dc3b-7e88-4c54-b050-bfb9f8c1c676 req-e437fee8-a55b-4294-8695-f2faa63b288b service nova] Acquiring lock "6a0d597a-4eea-4938-92c6-c17d0a6c82e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:48:00 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-9d61dc3b-7e88-4c54-b050-bfb9f8c1c676 req-e437fee8-a55b-4294-8695-f2faa63b288b service nova] Lock "6a0d597a-4eea-4938-92c6-c17d0a6c82e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:48:00 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-9d61dc3b-7e88-4c54-b050-bfb9f8c1c676 req-e437fee8-a55b-4294-8695-f2faa63b288b service nova] Lock "6a0d597a-4eea-4938-92c6-c17d0a6c82e5-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:48:00 user nova-compute[71283]: DEBUG nova.compute.manager [req-9d61dc3b-7e88-4c54-b050-bfb9f8c1c676 req-e437fee8-a55b-4294-8695-f2faa63b288b service nova] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] No waiting events found dispatching network-vif-plugged-5f63e57e-560d-426d-87da-b178c30bbb27 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:48:00 user nova-compute[71283]: WARNING nova.compute.manager [req-9d61dc3b-7e88-4c54-b050-bfb9f8c1c676 req-e437fee8-a55b-4294-8695-f2faa63b288b service nova] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Received unexpected event network-vif-plugged-5f63e57e-560d-426d-87da-b178c30bbb27 for instance with vm_state building and task_state spawning. Apr 20 10:48:00 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:48:00 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:48:00 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:48:00 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:48:01 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Resumed> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:48:01 user nova-compute[71283]: INFO nova.compute.manager [None 
req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] VM Resumed (Lifecycle Event) Apr 20 10:48:01 user nova-compute[71283]: DEBUG nova.compute.manager [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Instance event wait completed in 0 seconds for {{(pid=71283) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 10:48:01 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Guest created on hypervisor {{(pid=71283) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 10:48:01 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Instance spawned successfully. 
Apr 20 10:48:01 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 10:48:01 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:48:01 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:48:01 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Found default for hw_cdrom_bus of ide {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:48:01 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Found default for hw_disk_bus of virtio {{(pid=71283) 
_register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:48:01 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Found default for hw_input_bus of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:48:01 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Found default for hw_pointer_model of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:48:01 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Found default for hw_video_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:48:01 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Found default for hw_vif_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:48:01 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 20 10:48:01 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Started> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:48:01 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] VM Started (Lifecycle Event) Apr 20 10:48:01 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:48:01 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:48:01 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 10:48:01 user nova-compute[71283]: INFO nova.compute.manager [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Took 5.22 seconds to spawn the instance on the hypervisor. 
Apr 20 10:48:01 user nova-compute[71283]: DEBUG nova.compute.manager [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:48:02 user nova-compute[71283]: INFO nova.compute.manager [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Took 5.73 seconds to build instance. Apr 20 10:48:02 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-2a0a8754-5384-4f8e-a51a-bc2a5277c322 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "6a0d597a-4eea-4938-92c6-c17d0a6c82e5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 5.829s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:48:02 user nova-compute[71283]: DEBUG nova.compute.manager [req-9ffde888-560b-4c0d-a611-2e8287b8ac45 req-7fbb77ac-f40a-41b3-b3f8-f61493048af1 service nova] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Received event network-vif-plugged-5f63e57e-560d-426d-87da-b178c30bbb27 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:48:02 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-9ffde888-560b-4c0d-a611-2e8287b8ac45 req-7fbb77ac-f40a-41b3-b3f8-f61493048af1 service nova] Acquiring lock "6a0d597a-4eea-4938-92c6-c17d0a6c82e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:48:02 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils 
[req-9ffde888-560b-4c0d-a611-2e8287b8ac45 req-7fbb77ac-f40a-41b3-b3f8-f61493048af1 service nova] Lock "6a0d597a-4eea-4938-92c6-c17d0a6c82e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:48:02 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-9ffde888-560b-4c0d-a611-2e8287b8ac45 req-7fbb77ac-f40a-41b3-b3f8-f61493048af1 service nova] Lock "6a0d597a-4eea-4938-92c6-c17d0a6c82e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:48:02 user nova-compute[71283]: DEBUG nova.compute.manager [req-9ffde888-560b-4c0d-a611-2e8287b8ac45 req-7fbb77ac-f40a-41b3-b3f8-f61493048af1 service nova] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] No waiting events found dispatching network-vif-plugged-5f63e57e-560d-426d-87da-b178c30bbb27 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:48:02 user nova-compute[71283]: WARNING nova.compute.manager [req-9ffde888-560b-4c0d-a611-2e8287b8ac45 req-7fbb77ac-f40a-41b3-b3f8-f61493048af1 service nova] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Received unexpected event network-vif-plugged-5f63e57e-560d-426d-87da-b178c30bbb27 for instance with vm_state active and task_state None. 
Apr 20 10:48:03 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:48:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:48:08 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:48:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:48:13 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:48:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:48:18 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:48:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:48:23 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:48:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:48:24 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None 
None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:48:25 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:48:28 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:48:28 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:48:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:48:30 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:48:30 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:48:30 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:48:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:48:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:48:30 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:48:30 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:48:30 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C 
qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:48:30 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:48:30 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:48:31 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:48:31 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:48:31 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:48:31 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=9129MB free_disk=26.49781036376953GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, 
"label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": 
"0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": 
"0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 10:48:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:48:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:48:31 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 6a0d597a-4eea-4938-92c6-c17d0a6c82e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:48:31 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:48:31 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:48:31 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:48:31 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:48:31 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:48:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.212s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:48:32 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:48:32 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:48:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:48:34 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:48:34 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:48:34 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Rebuilding the list of instances to heal {{(pid=71283) _heal_instance_info_cache 
/opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 10:48:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "refresh_cache-6a0d597a-4eea-4938-92c6-c17d0a6c82e5" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:48:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquired lock "refresh_cache-6a0d597a-4eea-4938-92c6-c17d0a6c82e5" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:48:34 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Forcefully refreshing network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 10:48:34 user nova-compute[71283]: DEBUG nova.objects.instance [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lazy-loading 'info_cache' on Instance uuid 6a0d597a-4eea-4938-92c6-c17d0a6c82e5 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:48:35 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Updating instance_info_cache with network_info: [{"id": "5f63e57e-560d-426d-87da-b178c30bbb27", "address": "fa:16:3e:2b:1c:e8", "network": {"id": "94565cf9-f25f-4733-96b4-8ec0ecfda684", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1521378163-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "938a2ac3aa514d5ea63c1598221790f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f63e57e-56", "ovs_interfaceid": "5f63e57e-560d-426d-87da-b178c30bbb27", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:48:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Releasing lock "refresh_cache-6a0d597a-4eea-4938-92c6-c17d0a6c82e5" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:48:35 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Updated the network info_cache for instance {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 10:48:36 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:48:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:48:41 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:48:43 user nova-compute[71283]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}}
Apr 20 10:48:48 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:48:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:48:53 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:48:58 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:49:03 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}}
Apr 20 10:49:08 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}}
Apr 20 10:49:13 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:49:18 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:49:22 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 20 10:49:22 user nova-compute[71283]: DEBUG nova.compute.manager
[None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Cleaning up deleted instances {{(pid=71283) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}}
Apr 20 10:49:22 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] There are 0 instances to clean {{(pid=71283) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}}
Apr 20 10:49:23 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}}
Apr 20 10:49:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:49:25 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 20 10:49:27 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 20 10:49:28 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 20 10:49:29 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 20 10:49:30 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:49:30 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:49:31 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:49:31 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:49:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:49:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:49:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:49:31 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:49:31 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:49:31 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:49:31 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:49:32 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:49:32 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:49:32 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:49:32 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=9155MB free_disk=26.497173309326172GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, 
{"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": 
"1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 10:49:32 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:49:32 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:49:32 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 6a0d597a-4eea-4938-92c6-c17d0a6c82e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:49:32 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:49:32 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:49:32 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:49:32 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:49:32 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:49:32 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.194s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:49:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:49:34 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:49:34 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:49:34 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:49:34 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Rebuilding the list of instances to heal {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 10:49:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "refresh_cache-6a0d597a-4eea-4938-92c6-c17d0a6c82e5" {{(pid=71283) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:49:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquired lock "refresh_cache-6a0d597a-4eea-4938-92c6-c17d0a6c82e5" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:49:34 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Forcefully refreshing network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 10:49:34 user nova-compute[71283]: DEBUG nova.objects.instance [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lazy-loading 'info_cache' on Instance uuid 6a0d597a-4eea-4938-92c6-c17d0a6c82e5 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:49:35 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Updating instance_info_cache with network_info: [{"id": "5f63e57e-560d-426d-87da-b178c30bbb27", "address": "fa:16:3e:2b:1c:e8", "network": {"id": "94565cf9-f25f-4733-96b4-8ec0ecfda684", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1521378163-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "938a2ac3aa514d5ea63c1598221790f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f63e57e-56", "ovs_interfaceid": 
"5f63e57e-560d-426d-87da-b178c30bbb27", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:49:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Releasing lock "refresh_cache-6a0d597a-4eea-4938-92c6-c17d0a6c82e5" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:49:35 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Updated the network info_cache for instance {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 10:49:36 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:49:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:49:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:49:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:49:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 
20 10:49:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:49:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:49:38 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:49:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:49:43 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:49:43 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Cleaning up deleted instances with incomplete migration {{(pid=71283) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 20 10:49:47 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Acquiring lock "208031dc-1718-4e44-a856-34ceff17566c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:49:47 user nova-compute[71283]: DEBUG 
oslo_concurrency.lockutils [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "208031dc-1718-4e44-a856-34ceff17566c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:49:47 user nova-compute[71283]: DEBUG nova.compute.manager [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Starting instance... {{(pid=71283) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 10:49:47 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:49:47 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:49:47 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71283) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 10:49:47 user nova-compute[71283]: INFO nova.compute.claims [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Claim successful on node user Apr 20 10:49:47 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:49:47 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:49:47 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.199s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:49:47 user 
nova-compute[71283]: DEBUG nova.compute.manager [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Start building networks asynchronously for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 10:49:47 user nova-compute[71283]: DEBUG nova.compute.manager [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Allocating IP information in the background. {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 10:49:47 user nova-compute[71283]: DEBUG nova.network.neutron [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] allocate_for_instance() {{(pid=71283) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 10:49:47 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 10:49:47 user nova-compute[71283]: DEBUG nova.compute.manager [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Start building block device mappings for instance. 
{{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 10:49:47 user nova-compute[71283]: DEBUG nova.compute.manager [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Start spawning the instance on the hypervisor. {{(pid=71283) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 10:49:47 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Creating instance directory {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 10:49:47 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Creating image(s) Apr 20 10:49:47 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Acquiring lock "/opt/stack/data/nova/instances/208031dc-1718-4e44-a856-34ceff17566c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:49:47 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "/opt/stack/data/nova/instances/208031dc-1718-4e44-a856-34ceff17566c/disk.info" acquired by 
"nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:49:47 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "/opt/stack/data/nova/instances/208031dc-1718-4e44-a856-34ceff17566c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:49:47 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:49:47 user nova-compute[71283]: DEBUG nova.policy [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8b965155fa8a453f9f3ebbf5514bb96f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '938a2ac3aa514d5ea63c1598221790f8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71283) 
authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 10:49:48 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.134s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:49:48 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Acquiring lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:49:48 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:49:48 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share 
--output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:49:48 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.127s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:49:48 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/208031dc-1718-4e44-a856-34ceff17566c/disk 1073741824 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:49:48 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/208031dc-1718-4e44-a856-34ceff17566c/disk 1073741824" returned: 0 in 0.045s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:49:48 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 
tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.178s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:49:48 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:49:48 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.151s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:49:48 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Checking if we can resize image /opt/stack/data/nova/instances/208031dc-1718-4e44-a856-34ceff17566c/disk. 
size=1073741824 {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 10:49:48 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/208031dc-1718-4e44-a856-34ceff17566c/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:49:48 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:49:48 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:49:48 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/208031dc-1718-4e44-a856-34ceff17566c/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:49:48 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Cannot resize image /opt/stack/data/nova/instances/208031dc-1718-4e44-a856-34ceff17566c/disk to a smaller size. 
{{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 10:49:48 user nova-compute[71283]: DEBUG nova.objects.instance [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lazy-loading 'migration_context' on Instance uuid 208031dc-1718-4e44-a856-34ceff17566c {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:49:48 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Created local disks {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 10:49:48 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Ensure instance console log exists: /opt/stack/data/nova/instances/208031dc-1718-4e44-a856-34ceff17566c/console.log {{(pid=71283) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 10:49:48 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:49:48 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:49:48 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:49:48 user nova-compute[71283]: DEBUG nova.network.neutron [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Successfully created port: 858caa47-a0cd-45bd-bf83-fefb4794bcb5 {{(pid=71283) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.network.neutron [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Successfully updated port: 858caa47-a0cd-45bd-bf83-fefb4794bcb5 {{(pid=71283) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Acquiring lock "refresh_cache-208031dc-1718-4e44-a856-34ceff17566c" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 
tempest-ServersNegativeTestJSON-284593456-project-member] Acquired lock "refresh_cache-208031dc-1718-4e44-a856-34ceff17566c" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.network.neutron [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Building network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.compute.manager [req-df2b9494-9236-4dbe-af97-6bf7fa66aad7 req-f44ec148-8823-4ddb-be1f-6c1c90a49238 service nova] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Received event network-changed-858caa47-a0cd-45bd-bf83-fefb4794bcb5 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.compute.manager [req-df2b9494-9236-4dbe-af97-6bf7fa66aad7 req-f44ec148-8823-4ddb-be1f-6c1c90a49238 service nova] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Refreshing instance network info cache due to event network-changed-858caa47-a0cd-45bd-bf83-fefb4794bcb5. 
{{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-df2b9494-9236-4dbe-af97-6bf7fa66aad7 req-f44ec148-8823-4ddb-be1f-6c1c90a49238 service nova] Acquiring lock "refresh_cache-208031dc-1718-4e44-a856-34ceff17566c" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.network.neutron [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Instance cache missing network info. {{(pid=71283) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.network.neutron [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Updating instance_info_cache with network_info: [{"id": "858caa47-a0cd-45bd-bf83-fefb4794bcb5", "address": "fa:16:3e:d3:27:37", "network": {"id": "94565cf9-f25f-4733-96b4-8ec0ecfda684", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1521378163-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "938a2ac3aa514d5ea63c1598221790f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap858caa47-a0", "ovs_interfaceid": "858caa47-a0cd-45bd-bf83-fefb4794bcb5", "qbh_params": null, "qbg_params": 
null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Releasing lock "refresh_cache-208031dc-1718-4e44-a856-34ceff17566c" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.compute.manager [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Instance network_info: |[{"id": "858caa47-a0cd-45bd-bf83-fefb4794bcb5", "address": "fa:16:3e:d3:27:37", "network": {"id": "94565cf9-f25f-4733-96b4-8ec0ecfda684", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1521378163-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "938a2ac3aa514d5ea63c1598221790f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap858caa47-a0", "ovs_interfaceid": "858caa47-a0cd-45bd-bf83-fefb4794bcb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 10:49:49 
user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-df2b9494-9236-4dbe-af97-6bf7fa66aad7 req-f44ec148-8823-4ddb-be1f-6c1c90a49238 service nova] Acquired lock "refresh_cache-208031dc-1718-4e44-a856-34ceff17566c" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.network.neutron [req-df2b9494-9236-4dbe-af97-6bf7fa66aad7 req-f44ec148-8823-4ddb-be1f-6c1c90a49238 service nova] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Refreshing network info cache for port 858caa47-a0cd-45bd-bf83-fefb4794bcb5 {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Start _get_guest_xml network_info=[{"id": "858caa47-a0cd-45bd-bf83-fefb4794bcb5", "address": "fa:16:3e:d3:27:37", "network": {"id": "94565cf9-f25f-4733-96b4-8ec0ecfda684", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1521378163-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "938a2ac3aa514d5ea63c1598221790f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap858caa47-a0", "ovs_interfaceid": "858caa47-a0cd-45bd-bf83-fefb4794bcb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] 
disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '3687b278-c2bc-46f2-9b7e-579c8d06fe41'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 10:49:49 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:49:49 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71283) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T10:25:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=), allow threads: True {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Flavor limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] 
Image limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Flavor pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Image pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.virt.hardware [None 
req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Got 1 possible topologies {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:49:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1843635375',display_name='tempest-ServersNegativeTestJSON-server-1843635375',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1843635375',id=24,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='938a2ac3aa514d5ea63c1598221790f8',ramdisk_id='',reservation_id='r-02j1s8g5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-284593456',owner_user_name='tempest-ServersNegativeTestJSON-284593456-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:49:48Z,user_data=None,user_id='8b965155fa8a453f9f3eb
bf5514bb96f',uuid=208031dc-1718-4e44-a856-34ceff17566c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "858caa47-a0cd-45bd-bf83-fefb4794bcb5", "address": "fa:16:3e:d3:27:37", "network": {"id": "94565cf9-f25f-4733-96b4-8ec0ecfda684", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1521378163-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "938a2ac3aa514d5ea63c1598221790f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap858caa47-a0", "ovs_interfaceid": "858caa47-a0cd-45bd-bf83-fefb4794bcb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71283) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Converting VIF {"id": "858caa47-a0cd-45bd-bf83-fefb4794bcb5", "address": "fa:16:3e:d3:27:37", "network": {"id": "94565cf9-f25f-4733-96b4-8ec0ecfda684", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1521378163-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": 
"938a2ac3aa514d5ea63c1598221790f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap858caa47-a0", "ovs_interfaceid": "858caa47-a0cd-45bd-bf83-fefb4794bcb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:27:37,bridge_name='br-int',has_traffic_filtering=True,id=858caa47-a0cd-45bd-bf83-fefb4794bcb5,network=Network(94565cf9-f25f-4733-96b4-8ec0ecfda684),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap858caa47-a0') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.objects.instance [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lazy-loading 'pci_devices' on Instance uuid 208031dc-1718-4e44-a856-34ceff17566c {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] End _get_guest_xml xml= Apr 20 10:49:49 user nova-compute[71283]: 208031dc-1718-4e44-a856-34ceff17566c Apr 20 10:49:49 user nova-compute[71283]: instance-00000018 Apr 20 10:49:49 user nova-compute[71283]: 
131072 Apr 20 10:49:49 user nova-compute[71283]: 1 Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: tempest-ServersNegativeTestJSON-server-1843635375 Apr 20 10:49:49 user nova-compute[71283]: 2023-04-20 10:49:49 Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: 128 Apr 20 10:49:49 user nova-compute[71283]: 1 Apr 20 10:49:49 user nova-compute[71283]: 0 Apr 20 10:49:49 user nova-compute[71283]: 0 Apr 20 10:49:49 user nova-compute[71283]: 1 Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: tempest-ServersNegativeTestJSON-284593456-project-member Apr 20 10:49:49 user nova-compute[71283]: tempest-ServersNegativeTestJSON-284593456 Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: OpenStack Foundation Apr 20 10:49:49 user nova-compute[71283]: OpenStack Nova Apr 20 10:49:49 user nova-compute[71283]: 0.0.0 Apr 20 10:49:49 user nova-compute[71283]: 208031dc-1718-4e44-a856-34ceff17566c Apr 20 10:49:49 user nova-compute[71283]: 208031dc-1718-4e44-a856-34ceff17566c Apr 20 10:49:49 user nova-compute[71283]: Virtual Machine Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: hvm Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user 
nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Nehalem Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: /dev/urandom Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: Apr 20 10:49:49 user nova-compute[71283]: {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 
tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:49:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1843635375',display_name='tempest-ServersNegativeTestJSON-server-1843635375',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1843635375',id=24,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='938a2ac3aa514d5ea63c1598221790f8',ramdisk_id='',reservation_id='r-02j1s8g5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-284593456',owner_user_name='tempest-ServersNegativeTestJSON-284593456-project-member'},tags=TagList,task_state=
'spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:49:48Z,user_data=None,user_id='8b965155fa8a453f9f3ebbf5514bb96f',uuid=208031dc-1718-4e44-a856-34ceff17566c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "858caa47-a0cd-45bd-bf83-fefb4794bcb5", "address": "fa:16:3e:d3:27:37", "network": {"id": "94565cf9-f25f-4733-96b4-8ec0ecfda684", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1521378163-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "938a2ac3aa514d5ea63c1598221790f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap858caa47-a0", "ovs_interfaceid": "858caa47-a0cd-45bd-bf83-fefb4794bcb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Converting VIF {"id": "858caa47-a0cd-45bd-bf83-fefb4794bcb5", "address": "fa:16:3e:d3:27:37", "network": {"id": "94565cf9-f25f-4733-96b4-8ec0ecfda684", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1521378163-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "938a2ac3aa514d5ea63c1598221790f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap858caa47-a0", "ovs_interfaceid": "858caa47-a0cd-45bd-bf83-fefb4794bcb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:27:37,bridge_name='br-int',has_traffic_filtering=True,id=858caa47-a0cd-45bd-bf83-fefb4794bcb5,network=Network(94565cf9-f25f-4733-96b4-8ec0ecfda684),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap858caa47-a0') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG os_vif [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:27:37,bridge_name='br-int',has_traffic_filtering=True,id=858caa47-a0cd-45bd-bf83-fefb4794bcb5,network=Network(94565cf9-f25f-4733-96b4-8ec0ecfda684),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap858caa47-a0') {{(pid=71283) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap858caa47-a0, may_exist=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap858caa47-a0, col_values=(('external_ids', {'iface-id': '858caa47-a0cd-45bd-bf83-fefb4794bcb5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d3:27:37', 'vm-uuid': '208031dc-1718-4e44-a856-34ceff17566c'}),)) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:49:49 user 
nova-compute[71283]: INFO os_vif [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:27:37,bridge_name='br-int',has_traffic_filtering=True,id=858caa47-a0cd-45bd-bf83-fefb4794bcb5,network=Network(94565cf9-f25f-4733-96b4-8ec0ecfda684),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap858caa47-a0') Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] No BDM found with device name vda, not building metadata. {{(pid=71283) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 10:49:49 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] No VIF found with MAC fa:16:3e:d3:27:37, not building metadata {{(pid=71283) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 10:49:50 user nova-compute[71283]: DEBUG nova.network.neutron [req-df2b9494-9236-4dbe-af97-6bf7fa66aad7 req-f44ec148-8823-4ddb-be1f-6c1c90a49238 service nova] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Updated VIF entry in instance network info cache for port 858caa47-a0cd-45bd-bf83-fefb4794bcb5. 
{{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:49:50 user nova-compute[71283]: DEBUG nova.network.neutron [req-df2b9494-9236-4dbe-af97-6bf7fa66aad7 req-f44ec148-8823-4ddb-be1f-6c1c90a49238 service nova] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Updating instance_info_cache with network_info: [{"id": "858caa47-a0cd-45bd-bf83-fefb4794bcb5", "address": "fa:16:3e:d3:27:37", "network": {"id": "94565cf9-f25f-4733-96b4-8ec0ecfda684", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1521378163-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "938a2ac3aa514d5ea63c1598221790f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap858caa47-a0", "ovs_interfaceid": "858caa47-a0cd-45bd-bf83-fefb4794bcb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:49:50 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-df2b9494-9236-4dbe-af97-6bf7fa66aad7 req-f44ec148-8823-4ddb-be1f-6c1c90a49238 service nova] Releasing lock "refresh_cache-208031dc-1718-4e44-a856-34ceff17566c" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:49:51 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:49:51 user 
nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:49:51 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:49:51 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:49:51 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:49:51 user nova-compute[71283]: DEBUG nova.compute.manager [req-2ec0e0ea-4a9f-4c42-a1a8-253c784405a6 req-9fd89753-0f73-4bc1-8fe4-e18c8e6907e3 service nova] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Received event network-vif-plugged-858caa47-a0cd-45bd-bf83-fefb4794bcb5 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:49:51 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-2ec0e0ea-4a9f-4c42-a1a8-253c784405a6 req-9fd89753-0f73-4bc1-8fe4-e18c8e6907e3 service nova] Acquiring lock "208031dc-1718-4e44-a856-34ceff17566c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:49:51 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-2ec0e0ea-4a9f-4c42-a1a8-253c784405a6 req-9fd89753-0f73-4bc1-8fe4-e18c8e6907e3 service nova] Lock "208031dc-1718-4e44-a856-34ceff17566c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:49:51 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils 
[req-2ec0e0ea-4a9f-4c42-a1a8-253c784405a6 req-9fd89753-0f73-4bc1-8fe4-e18c8e6907e3 service nova] Lock "208031dc-1718-4e44-a856-34ceff17566c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:49:51 user nova-compute[71283]: DEBUG nova.compute.manager [req-2ec0e0ea-4a9f-4c42-a1a8-253c784405a6 req-9fd89753-0f73-4bc1-8fe4-e18c8e6907e3 service nova] [instance: 208031dc-1718-4e44-a856-34ceff17566c] No waiting events found dispatching network-vif-plugged-858caa47-a0cd-45bd-bf83-fefb4794bcb5 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:49:51 user nova-compute[71283]: WARNING nova.compute.manager [req-2ec0e0ea-4a9f-4c42-a1a8-253c784405a6 req-9fd89753-0f73-4bc1-8fe4-e18c8e6907e3 service nova] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Received unexpected event network-vif-plugged-858caa47-a0cd-45bd-bf83-fefb4794bcb5 for instance with vm_state building and task_state spawning. 
Apr 20 10:49:53 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Resumed> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:49:53 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 208031dc-1718-4e44-a856-34ceff17566c] VM Resumed (Lifecycle Event) Apr 20 10:49:53 user nova-compute[71283]: DEBUG nova.compute.manager [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Instance event wait completed in 0 seconds for {{(pid=71283) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 10:49:53 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Guest created on hypervisor {{(pid=71283) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 10:49:53 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Instance spawned successfully. 
Apr 20 10:49:53 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 10:49:53 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:49:53 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:49:53 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Found default for hw_cdrom_bus of ide {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:49:53 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Found default for hw_disk_bus of virtio {{(pid=71283) 
_register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:49:53 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Found default for hw_input_bus of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:49:53 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Found default for hw_pointer_model of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:49:53 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Found default for hw_video_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:49:53 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Found default for hw_vif_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:49:53 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 208031dc-1718-4e44-a856-34ceff17566c] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 20 10:49:53 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Started> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:49:53 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 208031dc-1718-4e44-a856-34ceff17566c] VM Started (Lifecycle Event) Apr 20 10:49:53 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:49:53 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:49:53 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 208031dc-1718-4e44-a856-34ceff17566c] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 20 10:49:53 user nova-compute[71283]: DEBUG nova.compute.manager [req-b3071613-dd2c-4ab5-aed0-4b3f235c87f5 req-afa3cc75-1b98-4de6-b294-f0ab56645cce service nova] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Received event network-vif-plugged-858caa47-a0cd-45bd-bf83-fefb4794bcb5 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:49:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-b3071613-dd2c-4ab5-aed0-4b3f235c87f5 req-afa3cc75-1b98-4de6-b294-f0ab56645cce service nova] Acquiring lock "208031dc-1718-4e44-a856-34ceff17566c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:49:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-b3071613-dd2c-4ab5-aed0-4b3f235c87f5 req-afa3cc75-1b98-4de6-b294-f0ab56645cce service nova] Lock "208031dc-1718-4e44-a856-34ceff17566c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:49:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-b3071613-dd2c-4ab5-aed0-4b3f235c87f5 req-afa3cc75-1b98-4de6-b294-f0ab56645cce service nova] Lock "208031dc-1718-4e44-a856-34ceff17566c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:49:53 user nova-compute[71283]: DEBUG nova.compute.manager [req-b3071613-dd2c-4ab5-aed0-4b3f235c87f5 req-afa3cc75-1b98-4de6-b294-f0ab56645cce service nova] [instance: 208031dc-1718-4e44-a856-34ceff17566c] No waiting events found dispatching network-vif-plugged-858caa47-a0cd-45bd-bf83-fefb4794bcb5 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 
20 10:49:53 user nova-compute[71283]: WARNING nova.compute.manager [req-b3071613-dd2c-4ab5-aed0-4b3f235c87f5 req-afa3cc75-1b98-4de6-b294-f0ab56645cce service nova] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Received unexpected event network-vif-plugged-858caa47-a0cd-45bd-bf83-fefb4794bcb5 for instance with vm_state building and task_state spawning. Apr 20 10:49:53 user nova-compute[71283]: INFO nova.compute.manager [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Took 5.45 seconds to spawn the instance on the hypervisor. Apr 20 10:49:53 user nova-compute[71283]: DEBUG nova.compute.manager [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:49:53 user nova-compute[71283]: INFO nova.compute.manager [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Took 5.95 seconds to build instance. 
Apr 20 10:49:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-9cc62249-f41f-49aa-9c12-4ddce128fa89 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "208031dc-1718-4e44-a856-34ceff17566c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.046s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:49:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:49:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:49:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:49:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:49:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:49:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:49:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:49:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:50:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:50:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:50:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:50:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:50:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:50:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:50:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:50:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:50:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:50:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:50:24 user nova-compute[71283]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:50:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:50:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:50:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:50:27 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:50:29 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:50:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:50:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:50:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:50:29 user 
nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:50:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:50:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:50:30 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:50:30 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:50:30 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:50:31 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:50:31 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:50:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:50:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:50:31 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:50:31 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:50:31 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/208031dc-1718-4e44-a856-34ceff17566c/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:50:31 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/208031dc-1718-4e44-a856-34ceff17566c/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:50:31 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/208031dc-1718-4e44-a856-34ceff17566c/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:50:32 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/208031dc-1718-4e44-a856-34ceff17566c/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:50:32 user nova-compute[71283]: 
DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:50:32 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json" returned: 0 in 0.127s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:50:32 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:50:32 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:50:32 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This 
host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:50:32 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:50:32 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=9090MB free_disk=26.475955963134766GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 10:50:32 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:50:32 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:50:32 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 6a0d597a-4eea-4938-92c6-c17d0a6c82e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:50:32 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 208031dc-1718-4e44-a856-34ceff17566c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:50:32 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:50:32 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:50:32 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Refreshing inventories for resource provider bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 20 10:50:32 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Updating ProviderTree inventory for provider bdbc83bd-9307-4e20-8e3d-430b77499399 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 20 10:50:32 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Updating inventory in ProviderTree for provider bdbc83bd-9307-4e20-8e3d-430b77499399 with 
inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 20 10:50:33 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Refreshing aggregate associations for resource provider bdbc83bd-9307-4e20-8e3d-430b77499399, aggregates: None {{(pid=71283) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 20 10:50:33 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Refreshing trait associations for resource provider bdbc83bd-9307-4e20-8e3d-430b77499399, traits: 
COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_EXTEND,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE {{(pid=71283) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 20 10:50:33 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:50:33 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 
'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:50:33 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:50:33 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.427s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:50:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:50:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:50:36 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:50:36 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:50:36 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Rebuilding the list of instances to heal 
{{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 10:50:36 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "refresh_cache-6a0d597a-4eea-4938-92c6-c17d0a6c82e5" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:50:36 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquired lock "refresh_cache-6a0d597a-4eea-4938-92c6-c17d0a6c82e5" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:50:36 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Forcefully refreshing network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 10:50:36 user nova-compute[71283]: DEBUG nova.objects.instance [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lazy-loading 'info_cache' on Instance uuid 6a0d597a-4eea-4938-92c6-c17d0a6c82e5 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:50:36 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Updating instance_info_cache with network_info: [{"id": "5f63e57e-560d-426d-87da-b178c30bbb27", "address": "fa:16:3e:2b:1c:e8", "network": {"id": "94565cf9-f25f-4733-96b4-8ec0ecfda684", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1521378163-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": 
"10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "938a2ac3aa514d5ea63c1598221790f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f63e57e-56", "ovs_interfaceid": "5f63e57e-560d-426d-87da-b178c30bbb27", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:50:36 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Releasing lock "refresh_cache-6a0d597a-4eea-4938-92c6-c17d0a6c82e5" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:50:36 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Updated the network info_cache for instance {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 10:50:36 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:50:36 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:50:39 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 
10:50:39 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:50:39 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:50:39 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:50:39 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:50:39 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:50:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:50:44 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:50:46 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:50:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:50:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:50:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:50:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:50:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:50:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:50:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:50:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:50:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:50:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:50:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:50:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 
{{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:50:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:50:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:51:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:51:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:51:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:51:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:51:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:51:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:51:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:51:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:51:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:51:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:51:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:51:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:51:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:51:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:51:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:51:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:51:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:51:14 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 
{{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:51:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:51:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:51:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:51:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:51:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:51:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:51:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:51:27 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:51:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:51:30 user nova-compute[71283]: 
DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:51:31 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:51:32 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:51:32 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:51:32 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:51:33 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:51:33 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:51:33 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:51:33 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:51:33 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:51:33 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C 
qemu-img info /opt/stack/data/nova/instances/208031dc-1718-4e44-a856-34ceff17566c/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:51:33 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/208031dc-1718-4e44-a856-34ceff17566c/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:51:33 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/208031dc-1718-4e44-a856-34ceff17566c/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:51:34 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/208031dc-1718-4e44-a856-34ceff17566c/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:51:34 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk 
--force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:51:34 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:51:34 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:51:34 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:51:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:51:34 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:51:34 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:51:34 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=9098MB free_disk=26.475311279296875GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, 
"label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": 
"0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": 
"0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 10:51:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:51:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.002s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:51:34 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 6a0d597a-4eea-4938-92c6-c17d0a6c82e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:51:34 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 208031dc-1718-4e44-a856-34ceff17566c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:51:34 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:51:34 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:51:35 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:51:35 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:51:35 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:51:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.217s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:51:37 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-973b5ea3-d088-4edc-9223-37f5b4ef62c2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Acquiring lock "208031dc-1718-4e44-a856-34ceff17566c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:51:37 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-973b5ea3-d088-4edc-9223-37f5b4ef62c2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "208031dc-1718-4e44-a856-34ceff17566c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:51:37 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-973b5ea3-d088-4edc-9223-37f5b4ef62c2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Acquiring lock "208031dc-1718-4e44-a856-34ceff17566c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:51:37 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-973b5ea3-d088-4edc-9223-37f5b4ef62c2 tempest-ServersNegativeTestJSON-284593456 
tempest-ServersNegativeTestJSON-284593456-project-member] Lock "208031dc-1718-4e44-a856-34ceff17566c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:51:37 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-973b5ea3-d088-4edc-9223-37f5b4ef62c2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "208031dc-1718-4e44-a856-34ceff17566c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:51:37 user nova-compute[71283]: INFO nova.compute.manager [None req-973b5ea3-d088-4edc-9223-37f5b4ef62c2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Terminating instance Apr 20 10:51:37 user nova-compute[71283]: DEBUG nova.compute.manager [None req-973b5ea3-d088-4edc-9223-37f5b4ef62c2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Start destroying the instance on the hypervisor. 
{{(pid=71283) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 10:51:37 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Skipping network cache update for instance because it is being deleted. {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9841}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Didn't find any instances for network info cache update. 
{{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG nova.compute.manager [req-a5282434-884a-4b00-bc48-3691613b3bd7 req-8a2b83ff-5ff5-4484-aeea-f34fbafe1f1b service nova] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Received event network-vif-unplugged-858caa47-a0cd-45bd-bf83-fefb4794bcb5 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-a5282434-884a-4b00-bc48-3691613b3bd7 req-8a2b83ff-5ff5-4484-aeea-f34fbafe1f1b service nova] Acquiring lock "208031dc-1718-4e44-a856-34ceff17566c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-a5282434-884a-4b00-bc48-3691613b3bd7 req-8a2b83ff-5ff5-4484-aeea-f34fbafe1f1b service nova] Lock "208031dc-1718-4e44-a856-34ceff17566c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-a5282434-884a-4b00-bc48-3691613b3bd7 req-8a2b83ff-5ff5-4484-aeea-f34fbafe1f1b service nova] Lock "208031dc-1718-4e44-a856-34ceff17566c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG nova.compute.manager [req-a5282434-884a-4b00-bc48-3691613b3bd7 req-8a2b83ff-5ff5-4484-aeea-f34fbafe1f1b service nova] [instance: 208031dc-1718-4e44-a856-34ceff17566c] No waiting events found dispatching network-vif-unplugged-858caa47-a0cd-45bd-bf83-fefb4794bcb5 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG nova.compute.manager [req-a5282434-884a-4b00-bc48-3691613b3bd7 req-8a2b83ff-5ff5-4484-aeea-f34fbafe1f1b service nova] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Received event network-vif-unplugged-858caa47-a0cd-45bd-bf83-fefb4794bcb5 for instance with task_state deleting. {{(pid=71283) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:51:38 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Instance destroyed successfully. 
Apr 20 10:51:38 user nova-compute[71283]: DEBUG nova.objects.instance [None req-973b5ea3-d088-4edc-9223-37f5b4ef62c2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lazy-loading 'resources' on Instance uuid 208031dc-1718-4e44-a856-34ceff17566c {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-973b5ea3-d088-4edc-9223-37f5b4ef62c2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:49:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1843635375',display_name='tempest-ServersNegativeTestJSON-server-1843635375',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1843635375',id=24,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2023-04-20T10:49:53Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='938a2ac3aa514d5ea63c1598221790f8',ramdisk_id='',reservation_id='r-02j1s8g5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersNegativeTestJSON-284593456',owner_user_name='tempest-ServersNegativeTestJSON-284593456-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2023-04-20T10:49:53Z,user_data=None,user_id='8b965155fa8a453f9f3ebbf5514bb96f',uuid=208031dc-1718-4e44-a856-34ceff17566c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "858caa47-a0cd-45bd-bf83-fefb4794bcb5", "address": "fa:16:3e:d3:27:37", "network": {"id": "94565cf9-f25f-4733-96b4-8ec0ecfda684", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1521378163-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "938a2ac3aa514d5ea63c1598221790f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap858caa47-a0", "ovs_interfaceid": "858caa47-a0cd-45bd-bf83-fefb4794bcb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-973b5ea3-d088-4edc-9223-37f5b4ef62c2 
tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Converting VIF {"id": "858caa47-a0cd-45bd-bf83-fefb4794bcb5", "address": "fa:16:3e:d3:27:37", "network": {"id": "94565cf9-f25f-4733-96b4-8ec0ecfda684", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1521378163-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "938a2ac3aa514d5ea63c1598221790f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap858caa47-a0", "ovs_interfaceid": "858caa47-a0cd-45bd-bf83-fefb4794bcb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-973b5ea3-d088-4edc-9223-37f5b4ef62c2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:27:37,bridge_name='br-int',has_traffic_filtering=True,id=858caa47-a0cd-45bd-bf83-fefb4794bcb5,network=Network(94565cf9-f25f-4733-96b4-8ec0ecfda684),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap858caa47-a0') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG os_vif [None req-973b5ea3-d088-4edc-9223-37f5b4ef62c2 tempest-ServersNegativeTestJSON-284593456 
tempest-ServersNegativeTestJSON-284593456-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:27:37,bridge_name='br-int',has_traffic_filtering=True,id=858caa47-a0cd-45bd-bf83-fefb4794bcb5,network=Network(94565cf9-f25f-4733-96b4-8ec0ecfda684),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap858caa47-a0') {{(pid=71283) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap858caa47-a0, bridge=br-int, if_exists=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:51:38 user nova-compute[71283]: INFO os_vif [None req-973b5ea3-d088-4edc-9223-37f5b4ef62c2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:27:37,bridge_name='br-int',has_traffic_filtering=True,id=858caa47-a0cd-45bd-bf83-fefb4794bcb5,network=Network(94565cf9-f25f-4733-96b4-8ec0ecfda684),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap858caa47-a0') Apr 20 10:51:38 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-973b5ea3-d088-4edc-9223-37f5b4ef62c2 
tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Deleting instance files /opt/stack/data/nova/instances/208031dc-1718-4e44-a856-34ceff17566c_del Apr 20 10:51:38 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-973b5ea3-d088-4edc-9223-37f5b4ef62c2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Deletion of /opt/stack/data/nova/instances/208031dc-1718-4e44-a856-34ceff17566c_del complete Apr 20 10:51:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:51:38 user nova-compute[71283]: INFO nova.compute.manager [None req-973b5ea3-d088-4edc-9223-37f5b4ef62c2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Took 0.65 seconds to destroy the instance on the hypervisor. Apr 20 10:51:38 user nova-compute[71283]: DEBUG oslo.service.loopingcall [None req-973b5ea3-d088-4edc-9223-37f5b4ef62c2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71283) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG nova.compute.manager [-] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Deallocating network for instance {{(pid=71283) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: 208031dc-1718-4e44-a856-34ceff17566c] deallocate_for_instance() {{(pid=71283) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:51:38 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Took 0.49 seconds to deallocate network for instance. 
Apr 20 10:51:38 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-973b5ea3-d088-4edc-9223-37f5b4ef62c2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-973b5ea3-d088-4edc-9223-37f5b4ef62c2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:51:38 user nova-compute[71283]: DEBUG nova.compute.manager [req-574b6f34-7525-4477-9414-b20eeba1f28e req-16acbfda-4035-475a-b4d5-35f2654f1b4f service nova] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Received event network-vif-deleted-858caa47-a0cd-45bd-bf83-fefb4794bcb5 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:51:39 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-973b5ea3-d088-4edc-9223-37f5b4ef62c2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:51:39 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-973b5ea3-d088-4edc-9223-37f5b4ef62c2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 
'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:51:39 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-973b5ea3-d088-4edc-9223-37f5b4ef62c2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.134s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:51:39 user nova-compute[71283]: INFO nova.scheduler.client.report [None req-973b5ea3-d088-4edc-9223-37f5b4ef62c2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Deleted allocations for instance 208031dc-1718-4e44-a856-34ceff17566c Apr 20 10:51:39 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-973b5ea3-d088-4edc-9223-37f5b4ef62c2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "208031dc-1718-4e44-a856-34ceff17566c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.455s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:51:40 user nova-compute[71283]: DEBUG nova.compute.manager [req-4b3fed1e-133f-4809-a27f-ac2054137a70 req-2e4c591d-a9e7-4c04-af79-0f22bb459339 service nova] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Received event network-vif-plugged-858caa47-a0cd-45bd-bf83-fefb4794bcb5 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:51:40 user 
nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-4b3fed1e-133f-4809-a27f-ac2054137a70 req-2e4c591d-a9e7-4c04-af79-0f22bb459339 service nova] Acquiring lock "208031dc-1718-4e44-a856-34ceff17566c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:51:40 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-4b3fed1e-133f-4809-a27f-ac2054137a70 req-2e4c591d-a9e7-4c04-af79-0f22bb459339 service nova] Lock "208031dc-1718-4e44-a856-34ceff17566c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:51:40 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-4b3fed1e-133f-4809-a27f-ac2054137a70 req-2e4c591d-a9e7-4c04-af79-0f22bb459339 service nova] Lock "208031dc-1718-4e44-a856-34ceff17566c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:51:40 user nova-compute[71283]: DEBUG nova.compute.manager [req-4b3fed1e-133f-4809-a27f-ac2054137a70 req-2e4c591d-a9e7-4c04-af79-0f22bb459339 service nova] [instance: 208031dc-1718-4e44-a856-34ceff17566c] No waiting events found dispatching network-vif-plugged-858caa47-a0cd-45bd-bf83-fefb4794bcb5 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:51:40 user nova-compute[71283]: WARNING nova.compute.manager [req-4b3fed1e-133f-4809-a27f-ac2054137a70 req-2e4c591d-a9e7-4c04-af79-0f22bb459339 service nova] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Received unexpected event network-vif-plugged-858caa47-a0cd-45bd-bf83-fefb4794bcb5 for instance with vm_state deleted and task_state None. 
Apr 20 10:51:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:51:48 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:51:53 user nova-compute[71283]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:51:53 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: 208031dc-1718-4e44-a856-34ceff17566c] VM Stopped (Lifecycle Event) Apr 20 10:51:53 user nova-compute[71283]: DEBUG nova.compute.manager [None req-8c0757d6-ac4c-4a03-b1bd-2225d6aa3b0e None None] [instance: 208031dc-1718-4e44-a856-34ceff17566c] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:51:53 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:51:58 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:52:03 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:52:08 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:52:13 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:52:18 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:52:23 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:52:28 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:52:29 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:52:31 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:52:32 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:52:32 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:52:32 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:52:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:52:33 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:52:35 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:52:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:52:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:52:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:52:35 user nova-compute[71283]: DEBUG 
nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:52:35 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:52:35 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:52:35 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:52:36 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk 
--force-share --output=json" returned: 0 in 0.154s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:52:36 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:52:36 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:52:36 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=9172MB free_disk=26.494319915771484GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", 
"product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": 
"07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": 
"type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 10:52:36 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:52:36 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:52:36 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 6a0d597a-4eea-4938-92c6-c17d0a6c82e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:52:36 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:52:36 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:52:36 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:52:36 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:52:36 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:52:36 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.215s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:52:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:52:38 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:52:38 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:52:38 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Rebuilding the list of instances to heal {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 10:52:38 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "refresh_cache-6a0d597a-4eea-4938-92c6-c17d0a6c82e5" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:52:38 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquired lock "refresh_cache-6a0d597a-4eea-4938-92c6-c17d0a6c82e5" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} 
Apr 20 10:52:38 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Forcefully refreshing network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 10:52:38 user nova-compute[71283]: DEBUG nova.objects.instance [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lazy-loading 'info_cache' on Instance uuid 6a0d597a-4eea-4938-92c6-c17d0a6c82e5 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:52:39 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Updating instance_info_cache with network_info: [{"id": "5f63e57e-560d-426d-87da-b178c30bbb27", "address": "fa:16:3e:2b:1c:e8", "network": {"id": "94565cf9-f25f-4733-96b4-8ec0ecfda684", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1521378163-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "938a2ac3aa514d5ea63c1598221790f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f63e57e-56", "ovs_interfaceid": "5f63e57e-560d-426d-87da-b178c30bbb27", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:52:39 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Releasing lock "refresh_cache-6a0d597a-4eea-4938-92c6-c17d0a6c82e5" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:52:39 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Updated the network info_cache for instance {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 10:52:39 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:52:39 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:52:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:52:46 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:52:48 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:52:53 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 
10:52:58 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:53:03 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:53:08 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:53:13 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:53:18 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:53:23 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:53:23 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:53:23 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:53:23 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:53:23 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:53:23 user nova-compute[71283]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:53:28 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:53:29 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:53:31 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:53:31 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:53:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:53:34 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:53:34 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:53:34 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:53:34 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:53:36 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:53:36 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:53:36 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:53:36 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:53:36 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally 
available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:53:36 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:53:36 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:53:36 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:53:37 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71283) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:53:37 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:53:37 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:53:37 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=9176MB free_disk=26.493698120117188GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": 
"0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": 
"0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 10:53:37 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:53:37 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:53:37 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 6a0d597a-4eea-4938-92c6-c17d0a6c82e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:53:37 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:53:37 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:53:37 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:53:37 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:53:37 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:53:37 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.151s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:53:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:53:39 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:53:39 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:53:39 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Rebuilding the list of instances to heal {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 10:53:39 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "refresh_cache-6a0d597a-4eea-4938-92c6-c17d0a6c82e5" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:53:39 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquired lock "refresh_cache-6a0d597a-4eea-4938-92c6-c17d0a6c82e5" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} 
Apr 20 10:53:39 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Forcefully refreshing network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 10:53:39 user nova-compute[71283]: DEBUG nova.objects.instance [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lazy-loading 'info_cache' on Instance uuid 6a0d597a-4eea-4938-92c6-c17d0a6c82e5 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:53:40 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Updating instance_info_cache with network_info: [{"id": "5f63e57e-560d-426d-87da-b178c30bbb27", "address": "fa:16:3e:2b:1c:e8", "network": {"id": "94565cf9-f25f-4733-96b4-8ec0ecfda684", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1521378163-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "938a2ac3aa514d5ea63c1598221790f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f63e57e-56", "ovs_interfaceid": "5f63e57e-560d-426d-87da-b178c30bbb27", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:53:40 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Releasing lock "refresh_cache-6a0d597a-4eea-4938-92c6-c17d0a6c82e5" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:53:40 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Updated the network info_cache for instance {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 10:53:40 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:53:40 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:53:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:53:48 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:53:53 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:53:58 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:54:03 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:54:08 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:54:13 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:54:18 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:54:23 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:54:28 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:54:29 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:54:29 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Cleaning up deleted instances {{(pid=71283) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 20 10:54:29 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] There are 0 instances to clean {{(pid=71283) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 20 10:54:31 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) 
run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:54:31 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:54:32 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:54:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:54:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:54:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:54:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:54:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:54:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:54:34 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:54:35 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:54:35 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:54:36 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:54:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:54:38 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:54:38 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:54:38 user nova-compute[71283]: DEBUG 
oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:54:38 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:54:38 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:54:38 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:54:38 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:54:38 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:54:39 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:54:39 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:54:39 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:54:39 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=9182MB free_disk=26.49309539794922GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 
20 10:54:39 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:54:39 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:54:39 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 6a0d597a-4eea-4938-92c6-c17d0a6c82e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:54:39 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:54:39 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:54:39 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:54:39 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:54:39 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:54:39 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.171s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:54:41 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:54:41 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:54:41 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Rebuilding the list of instances to heal {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 10:54:41 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "refresh_cache-6a0d597a-4eea-4938-92c6-c17d0a6c82e5" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:54:41 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquired lock "refresh_cache-6a0d597a-4eea-4938-92c6-c17d0a6c82e5" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:54:41 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Forcefully 
refreshing network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 10:54:41 user nova-compute[71283]: DEBUG nova.objects.instance [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lazy-loading 'info_cache' on Instance uuid 6a0d597a-4eea-4938-92c6-c17d0a6c82e5 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:54:42 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Updating instance_info_cache with network_info: [{"id": "5f63e57e-560d-426d-87da-b178c30bbb27", "address": "fa:16:3e:2b:1c:e8", "network": {"id": "94565cf9-f25f-4733-96b4-8ec0ecfda684", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1521378163-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "938a2ac3aa514d5ea63c1598221790f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f63e57e-56", "ovs_interfaceid": "5f63e57e-560d-426d-87da-b178c30bbb27", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:54:42 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Releasing lock "refresh_cache-6a0d597a-4eea-4938-92c6-c17d0a6c82e5" {{(pid=71283) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:54:42 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Updated the network info_cache for instance {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 10:54:42 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:54:42 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:54:42 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:54:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:54:48 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:54:50 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:54:53 
user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:54:55 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:54:55 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Cleaning up deleted instances with incomplete migration {{(pid=71283) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 20 10:54:58 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:55:03 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:55:08 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:55:09 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._sync_power_states {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:55:09 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Triggering sync for uuid 6a0d597a-4eea-4938-92c6-c17d0a6c82e5 {{(pid=71283) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 20 10:55:09 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "6a0d597a-4eea-4938-92c6-c17d0a6c82e5" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:55:09 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "6a0d597a-4eea-4938-92c6-c17d0a6c82e5" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:55:09 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "6a0d597a-4eea-4938-92c6-c17d0a6c82e5" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.026s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:55:13 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:55:18 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:55:23 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:55:28 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:55:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:55:33 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:55:33 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:55:35 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:55:37 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:55:37 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:55:37 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:55:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:55:38 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:55:38 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:55:38 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:55:38 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:55:38 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:55:38 user nova-compute[71283]: DEBUG 
oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:55:38 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:55:38 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:55:39 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:55:39 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host 
appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:55:39 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:55:39 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=9174MB free_disk=26.484691619873047GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 10:55:39 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:55:39 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:55:39 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 6a0d597a-4eea-4938-92c6-c17d0a6c82e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:55:39 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:55:39 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:55:39 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Refreshing inventories for resource provider bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 20 10:55:39 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Updating ProviderTree inventory for provider bdbc83bd-9307-4e20-8e3d-430b77499399 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 20 10:55:39 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Updating inventory in ProviderTree for provider bdbc83bd-9307-4e20-8e3d-430b77499399 with 
inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 20 10:55:39 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Refreshing aggregate associations for resource provider bdbc83bd-9307-4e20-8e3d-430b77499399, aggregates: None {{(pid=71283) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 20 10:55:39 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Refreshing trait associations for resource provider bdbc83bd-9307-4e20-8e3d-430b77499399, traits: 
COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_EXTEND,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE {{(pid=71283) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 20 10:55:39 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:55:39 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 
'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:55:39 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:55:39 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.444s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:55:41 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:55:41 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:55:42 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:55:42 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) 
_heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:55:42 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Rebuilding the list of instances to heal {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 10:55:42 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "refresh_cache-6a0d597a-4eea-4938-92c6-c17d0a6c82e5" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:55:42 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquired lock "refresh_cache-6a0d597a-4eea-4938-92c6-c17d0a6c82e5" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:55:42 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Forcefully refreshing network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 10:55:42 user nova-compute[71283]: DEBUG nova.objects.instance [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lazy-loading 'info_cache' on Instance uuid 6a0d597a-4eea-4938-92c6-c17d0a6c82e5 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:55:43 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Updating instance_info_cache with network_info: [{"id": "5f63e57e-560d-426d-87da-b178c30bbb27", "address": "fa:16:3e:2b:1c:e8", "network": {"id": "94565cf9-f25f-4733-96b4-8ec0ecfda684", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1521378163-network", "subnets": [{"cidr": "10.0.0.0/28", 
"dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "938a2ac3aa514d5ea63c1598221790f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f63e57e-56", "ovs_interfaceid": "5f63e57e-560d-426d-87da-b178c30bbb27", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:55:43 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Releasing lock "refresh_cache-6a0d597a-4eea-4938-92c6-c17d0a6c82e5" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:55:43 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Updated the network info_cache for instance {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 10:55:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:55:48 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:55:53 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 
20 10:55:58 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:56:03 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:56:08 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:56:13 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:56:18 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:56:18 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:56:18 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:56:18 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:56:18 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:56:18 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:56:23 user nova-compute[71283]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:56:28 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:56:33 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-27bb2d72-a227-495a-8ea0-209fa8cb5bd2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Acquiring lock "6a0d597a-4eea-4938-92c6-c17d0a6c82e5" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:56:33 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-27bb2d72-a227-495a-8ea0-209fa8cb5bd2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "6a0d597a-4eea-4938-92c6-c17d0a6c82e5" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:56:33 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-27bb2d72-a227-495a-8ea0-209fa8cb5bd2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Acquiring lock "6a0d597a-4eea-4938-92c6-c17d0a6c82e5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:56:33 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-27bb2d72-a227-495a-8ea0-209fa8cb5bd2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "6a0d597a-4eea-4938-92c6-c17d0a6c82e5-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:56:33 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-27bb2d72-a227-495a-8ea0-209fa8cb5bd2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "6a0d597a-4eea-4938-92c6-c17d0a6c82e5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:56:33 user nova-compute[71283]: INFO nova.compute.manager [None req-27bb2d72-a227-495a-8ea0-209fa8cb5bd2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Terminating instance Apr 20 10:56:33 user nova-compute[71283]: DEBUG nova.compute.manager [None req-27bb2d72-a227-495a-8ea0-209fa8cb5bd2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Start destroying the instance on the hypervisor. 
{{(pid=71283) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 10:56:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:56:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:56:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:56:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:56:33 user nova-compute[71283]: DEBUG nova.compute.manager [req-a8f23d70-fb9b-4afb-967e-600f9d23b932 req-554ddce6-631d-47dd-911c-8dd64bbd7ea6 service nova] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Received event network-vif-unplugged-5f63e57e-560d-426d-87da-b178c30bbb27 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:56:33 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-a8f23d70-fb9b-4afb-967e-600f9d23b932 req-554ddce6-631d-47dd-911c-8dd64bbd7ea6 service nova] Acquiring lock "6a0d597a-4eea-4938-92c6-c17d0a6c82e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:56:33 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-a8f23d70-fb9b-4afb-967e-600f9d23b932 req-554ddce6-631d-47dd-911c-8dd64bbd7ea6 service nova] Lock "6a0d597a-4eea-4938-92c6-c17d0a6c82e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:56:33 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-a8f23d70-fb9b-4afb-967e-600f9d23b932 req-554ddce6-631d-47dd-911c-8dd64bbd7ea6 service nova] Lock "6a0d597a-4eea-4938-92c6-c17d0a6c82e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:56:33 user nova-compute[71283]: DEBUG nova.compute.manager [req-a8f23d70-fb9b-4afb-967e-600f9d23b932 req-554ddce6-631d-47dd-911c-8dd64bbd7ea6 service nova] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] No waiting events found dispatching network-vif-unplugged-5f63e57e-560d-426d-87da-b178c30bbb27 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:56:33 user nova-compute[71283]: DEBUG nova.compute.manager [req-a8f23d70-fb9b-4afb-967e-600f9d23b932 req-554ddce6-631d-47dd-911c-8dd64bbd7ea6 service nova] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Received event network-vif-unplugged-5f63e57e-560d-426d-87da-b178c30bbb27 for instance with task_state deleting. {{(pid=71283) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 10:56:33 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:56:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:56:33 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Instance destroyed successfully. 
Apr 20 10:56:33 user nova-compute[71283]: DEBUG nova.objects.instance [None req-27bb2d72-a227-495a-8ea0-209fa8cb5bd2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lazy-loading 'resources' on Instance uuid 6a0d597a-4eea-4938-92c6-c17d0a6c82e5 {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:56:33 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-27bb2d72-a227-495a-8ea0-209fa8cb5bd2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:47:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-504126875',display_name='tempest-ServersNegativeTestJSON-server-504126875',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-504126875',id=23,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-20T10:48:01Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='938a2ac3aa514d5ea63c1598221790f8',ramdisk_id='',reservation_id='r-y9va7a3q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_form
at='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersNegativeTestJSON-284593456',owner_user_name='tempest-ServersNegativeTestJSON-284593456-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T10:48:02Z,user_data=None,user_id='8b965155fa8a453f9f3ebbf5514bb96f',uuid=6a0d597a-4eea-4938-92c6-c17d0a6c82e5,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5f63e57e-560d-426d-87da-b178c30bbb27", "address": "fa:16:3e:2b:1c:e8", "network": {"id": "94565cf9-f25f-4733-96b4-8ec0ecfda684", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1521378163-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "938a2ac3aa514d5ea63c1598221790f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f63e57e-56", "ovs_interfaceid": "5f63e57e-560d-426d-87da-b178c30bbb27", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 10:56:33 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-27bb2d72-a227-495a-8ea0-209fa8cb5bd2 
tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Converting VIF {"id": "5f63e57e-560d-426d-87da-b178c30bbb27", "address": "fa:16:3e:2b:1c:e8", "network": {"id": "94565cf9-f25f-4733-96b4-8ec0ecfda684", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1521378163-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "938a2ac3aa514d5ea63c1598221790f8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f63e57e-56", "ovs_interfaceid": "5f63e57e-560d-426d-87da-b178c30bbb27", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:56:33 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-27bb2d72-a227-495a-8ea0-209fa8cb5bd2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2b:1c:e8,bridge_name='br-int',has_traffic_filtering=True,id=5f63e57e-560d-426d-87da-b178c30bbb27,network=Network(94565cf9-f25f-4733-96b4-8ec0ecfda684),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f63e57e-56') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:56:33 user nova-compute[71283]: DEBUG os_vif [None req-27bb2d72-a227-495a-8ea0-209fa8cb5bd2 tempest-ServersNegativeTestJSON-284593456 
tempest-ServersNegativeTestJSON-284593456-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2b:1c:e8,bridge_name='br-int',has_traffic_filtering=True,id=5f63e57e-560d-426d-87da-b178c30bbb27,network=Network(94565cf9-f25f-4733-96b4-8ec0ecfda684),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f63e57e-56') {{(pid=71283) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 10:56:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:56:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5f63e57e-56, bridge=br-int, if_exists=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:56:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:56:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:56:33 user nova-compute[71283]: INFO os_vif [None req-27bb2d72-a227-495a-8ea0-209fa8cb5bd2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2b:1c:e8,bridge_name='br-int',has_traffic_filtering=True,id=5f63e57e-560d-426d-87da-b178c30bbb27,network=Network(94565cf9-f25f-4733-96b4-8ec0ecfda684),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f63e57e-56') Apr 20 10:56:33 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-27bb2d72-a227-495a-8ea0-209fa8cb5bd2 
tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Deleting instance files /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5_del Apr 20 10:56:33 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-27bb2d72-a227-495a-8ea0-209fa8cb5bd2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Deletion of /opt/stack/data/nova/instances/6a0d597a-4eea-4938-92c6-c17d0a6c82e5_del complete Apr 20 10:56:33 user nova-compute[71283]: INFO nova.compute.manager [None req-27bb2d72-a227-495a-8ea0-209fa8cb5bd2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Took 0.87 seconds to destroy the instance on the hypervisor. Apr 20 10:56:33 user nova-compute[71283]: DEBUG oslo.service.loopingcall [None req-27bb2d72-a227-495a-8ea0-209fa8cb5bd2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=71283) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 10:56:33 user nova-compute[71283]: DEBUG nova.compute.manager [-] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Deallocating network for instance {{(pid=71283) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 10:56:33 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] deallocate_for_instance() {{(pid=71283) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 10:56:34 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:56:34 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Took 0.50 seconds to deallocate network for instance. 
Apr 20 10:56:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-27bb2d72-a227-495a-8ea0-209fa8cb5bd2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:56:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-27bb2d72-a227-495a-8ea0-209fa8cb5bd2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:56:34 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-27bb2d72-a227-495a-8ea0-209fa8cb5bd2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:56:34 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-27bb2d72-a227-495a-8ea0-209fa8cb5bd2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:56:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-27bb2d72-a227-495a-8ea0-209fa8cb5bd2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.119s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:56:34 user nova-compute[71283]: INFO nova.scheduler.client.report [None req-27bb2d72-a227-495a-8ea0-209fa8cb5bd2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Deleted allocations for instance 6a0d597a-4eea-4938-92c6-c17d0a6c82e5 Apr 20 10:56:34 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-27bb2d72-a227-495a-8ea0-209fa8cb5bd2 tempest-ServersNegativeTestJSON-284593456 tempest-ServersNegativeTestJSON-284593456-project-member] Lock "6a0d597a-4eea-4938-92c6-c17d0a6c82e5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.692s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:56:35 user nova-compute[71283]: DEBUG nova.compute.manager [req-e784eddb-0bc6-4519-be09-e6b9b8c9afe1 req-2ef96aad-692f-4305-8b51-73ab8dcabc20 service nova] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Received event network-vif-plugged-5f63e57e-560d-426d-87da-b178c30bbb27 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:56:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-e784eddb-0bc6-4519-be09-e6b9b8c9afe1 req-2ef96aad-692f-4305-8b51-73ab8dcabc20 service nova] Acquiring lock "6a0d597a-4eea-4938-92c6-c17d0a6c82e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:56:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-e784eddb-0bc6-4519-be09-e6b9b8c9afe1 req-2ef96aad-692f-4305-8b51-73ab8dcabc20 service nova] Lock "6a0d597a-4eea-4938-92c6-c17d0a6c82e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:56:35 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-e784eddb-0bc6-4519-be09-e6b9b8c9afe1 req-2ef96aad-692f-4305-8b51-73ab8dcabc20 service nova] Lock "6a0d597a-4eea-4938-92c6-c17d0a6c82e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:56:35 user nova-compute[71283]: DEBUG nova.compute.manager [req-e784eddb-0bc6-4519-be09-e6b9b8c9afe1 req-2ef96aad-692f-4305-8b51-73ab8dcabc20 service nova] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] No waiting events found dispatching network-vif-plugged-5f63e57e-560d-426d-87da-b178c30bbb27 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:56:35 user nova-compute[71283]: WARNING nova.compute.manager [req-e784eddb-0bc6-4519-be09-e6b9b8c9afe1 req-2ef96aad-692f-4305-8b51-73ab8dcabc20 service nova] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Received unexpected event network-vif-plugged-5f63e57e-560d-426d-87da-b178c30bbb27 for instance with vm_state deleted and task_state None. 
Apr 20 10:56:35 user nova-compute[71283]: DEBUG nova.compute.manager [req-e784eddb-0bc6-4519-be09-e6b9b8c9afe1 req-2ef96aad-692f-4305-8b51-73ab8dcabc20 service nova] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Received event network-vif-deleted-5f63e57e-560d-426d-87da-b178c30bbb27 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:56:35 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:56:36 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:56:37 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:56:37 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:56:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:56:39 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:56:39 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:56:39 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:56:39 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:56:39 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:56:39 user nova-compute[71283]: DEBUG 
nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:56:40 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:56:40 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:56:40 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=9250MB free_disk=26.503631591796875GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": 
"0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": 
"0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 10:56:40 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:56:40 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:56:40 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:56:40 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 
pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:56:40 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:56:40 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:56:40 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:56:40 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.141s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:56:41 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:56:41 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:56:43 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:56:43 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:56:43 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Rebuilding the list of instances to heal {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 10:56:43 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Didn't find any instances for network info cache update. 
{{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 20 10:56:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:56:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:56:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:56:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:56:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:56:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:56:48 user nova-compute[71283]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:56:48 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] VM Stopped (Lifecycle Event) Apr 20 10:56:48 user nova-compute[71283]: DEBUG nova.compute.manager [None req-a941367c-bc1e-44ae-b84f-1aeb0d2569ad None None] [instance: 6a0d597a-4eea-4938-92c6-c17d0a6c82e5] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:56:48 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:56:52 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:56:53 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:56:58 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:57:03 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:57:08 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:57:13 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:57:18 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:57:23 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:57:24 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:57:25 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:57:28 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:57:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:57:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:57:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:57:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:57:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:57:33 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:57:34 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:57:37 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:57:37 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:57:37 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:57:38 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:57:38 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:57:40 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:57:40 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:57:40 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:57:40 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:57:40 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:57:40 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:57:40 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:57:41 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:57:41 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:57:41 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=9249MB free_disk=26.501811981201172GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 
20 10:57:41 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:57:41 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:57:41 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:57:41 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:57:41 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:57:41 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 
4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:57:41 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:57:41 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.120s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:57:43 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:57:43 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:57:43 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:57:43 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Rebuilding the list of instances to heal 
{{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 10:57:43 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Didn't find any instances for network info cache update. {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 20 10:57:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:57:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:57:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:57:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:57:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:57:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:57:46 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:57:48 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:57:49 user nova-compute[71283]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:57:49 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:57:53 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:57:53 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:57:54 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:57:56 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:57:58 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:57:59 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Acquiring lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:57:59 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Lock 
"2874b7cd-8c61-46d5-a949-8a8c2469b9af" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:57:59 user nova-compute[71283]: DEBUG nova.compute.manager [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Starting instance... {{(pid=71283) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 10:57:59 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:57:59 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:57:59 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71283) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 10:57:59 user nova-compute[71283]: INFO nova.compute.claims [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Claim successful on node user Apr 20 10:57:59 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:57:59 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:57:59 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.206s {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:57:59 user nova-compute[71283]: DEBUG nova.compute.manager [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Start building networks asynchronously for instance. {{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 10:57:59 user nova-compute[71283]: DEBUG nova.compute.manager [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Allocating IP information in the background. {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 10:57:59 user nova-compute[71283]: DEBUG nova.network.neutron [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] allocate_for_instance() {{(pid=71283) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 10:57:59 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 10:57:59 user nova-compute[71283]: DEBUG nova.compute.manager [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Start building block device mappings for instance. 
{{(pid=71283) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 10:57:59 user nova-compute[71283]: DEBUG nova.policy [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1efe465dafde459899d76107759d64e9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '730b5f2c76df4e769572f6296407c103', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71283) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 10:57:59 user nova-compute[71283]: DEBUG nova.compute.manager [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Start spawning the instance on the hypervisor. 
{{(pid=71283) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 10:57:59 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Creating instance directory {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 10:57:59 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Creating image(s) Apr 20 10:57:59 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Acquiring lock "/opt/stack/data/nova/instances/2874b7cd-8c61-46d5-a949-8a8c2469b9af/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:57:59 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Lock "/opt/stack/data/nova/instances/2874b7cd-8c61-46d5-a949-8a8c2469b9af/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:57:59 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 
tempest-SnapshotDataIntegrityTests-268465305-project-member] Lock "/opt/stack/data/nova/instances/2874b7cd-8c61-46d5-a949-8a8c2469b9af/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:57:59 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:57:59 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:57:59 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.135s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:57:59 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Acquiring lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" by 
"nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:57:59 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:57:59 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:57:59 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.126s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:57:59 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 
tempest-SnapshotDataIntegrityTests-268465305-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/2874b7cd-8c61-46d5-a949-8a8c2469b9af/disk 1073741824 {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:58:00 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113,backing_fmt=raw /opt/stack/data/nova/instances/2874b7cd-8c61-46d5-a949-8a8c2469b9af/disk 1073741824" returned: 0 in 0.045s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:58:00 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Lock "7797fd71afd0d2d63c1e53d7f4988b954ed93113" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.177s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:58:00 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json {{(pid=71283) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:58:00 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7797fd71afd0d2d63c1e53d7f4988b954ed93113 --force-share --output=json" returned: 0 in 0.126s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:58:00 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Checking if we can resize image /opt/stack/data/nova/instances/2874b7cd-8c61-46d5-a949-8a8c2469b9af/disk. size=1073741824 {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 10:58:00 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2874b7cd-8c61-46d5-a949-8a8c2469b9af/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:58:00 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/2874b7cd-8c61-46d5-a949-8a8c2469b9af/disk --force-share --output=json" returned: 0 in 0.127s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:58:00 user nova-compute[71283]: DEBUG nova.virt.disk.api [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Cannot resize image /opt/stack/data/nova/instances/2874b7cd-8c61-46d5-a949-8a8c2469b9af/disk to a smaller size. {{(pid=71283) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 10:58:00 user nova-compute[71283]: DEBUG nova.objects.instance [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Lazy-loading 'migration_context' on Instance uuid 2874b7cd-8c61-46d5-a949-8a8c2469b9af {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:58:00 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Created local disks {{(pid=71283) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 10:58:00 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Ensure instance console log exists: /opt/stack/data/nova/instances/2874b7cd-8c61-46d5-a949-8a8c2469b9af/console.log {{(pid=71283) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 10:58:00 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None 
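The "Cannot resize image ... to a smaller size." record above is emitted by `nova.virt.disk.api.can_resize_image`, which compares the flavor's requested root-disk size against the overlay's current virtual size (as reported by `qemu-img info`) and only permits growth; here both values are 1073741824 bytes, so the resize is skipped. A minimal sketch of that guard (names and the exact comparison are illustrative, not Nova's verbatim code):

```python
def can_resize_image(virtual_size: int, requested_size: int) -> bool:
    # Growing a disk is safe; shrinking (or a no-op "resize" to the same
    # size) is refused, matching the DEBUG record in the log above.
    if virtual_size >= requested_size:
        print("Cannot resize image to a smaller size.")
        return False
    return True

# The flavor root disk and the freshly created overlay are both 1 GiB here,
# so the check declines, exactly as logged.
print(can_resize_image(1073741824, 1073741824))
```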
req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:58:00 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:58:00 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:58:00 user nova-compute[71283]: DEBUG nova.network.neutron [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Successfully created port: 7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 {{(pid=71283) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.network.neutron [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Successfully updated port: 7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 {{(pid=71283) _update_port 
/opt/stack/nova/nova/network/neutron.py:584}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Acquiring lock "refresh_cache-2874b7cd-8c61-46d5-a949-8a8c2469b9af" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Acquired lock "refresh_cache-2874b7cd-8c61-46d5-a949-8a8c2469b9af" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.network.neutron [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Building network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.network.neutron [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Instance cache missing network info. 
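The Acquiring/Acquired/Releasing triplets in these records come from oslo_concurrency.lockutils, which keeps one in-process lock object per name; the name here embeds the instance UUID ("refresh_cache-2874b7cd-..."), so concurrent network-cache refreshes for the same instance serialize while different instances proceed in parallel. A minimal stdlib stand-in for that pattern (not oslo's actual implementation):

```python
import threading
import time

_locks = {}
_registry_lock = threading.Lock()

def named_lock(name: str) -> threading.Lock:
    # One process-wide lock per name, created on first use. The registry
    # itself is guarded so two threads racing on a new name get the same
    # lock object back.
    with _registry_lock:
        return _locks.setdefault(name, threading.Lock())

name = "refresh_cache-2874b7cd-8c61-46d5-a949-8a8c2469b9af"
lock = named_lock(name)
t0 = time.monotonic()
lock.acquire()
print(f'Acquired lock "{name}"')
lock.release()
print(f'Releasing lock "{name}" :: held {time.monotonic() - t0:.3f}s')
```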
{{(pid=71283) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.compute.manager [req-2078e028-bb30-4ea3-b086-a06c62dd4bc4 req-74a8d581-7f37-491c-8ead-0f9fcd7cbdab service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Received event network-changed-7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.compute.manager [req-2078e028-bb30-4ea3-b086-a06c62dd4bc4 req-74a8d581-7f37-491c-8ead-0f9fcd7cbdab service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Refreshing instance network info cache due to event network-changed-7a0b25cb-ada6-4f3c-bdd7-a9c65d503530. {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-2078e028-bb30-4ea3-b086-a06c62dd4bc4 req-74a8d581-7f37-491c-8ead-0f9fcd7cbdab service nova] Acquiring lock "refresh_cache-2874b7cd-8c61-46d5-a949-8a8c2469b9af" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.network.neutron [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Updating instance_info_cache with network_info: [{"id": "7a0b25cb-ada6-4f3c-bdd7-a9c65d503530", "address": "fa:16:3e:5e:e5:c2", "network": {"id": "52844f20-cd97-4994-ae95-56c25466dfd5", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-534067309-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "730b5f2c76df4e769572f6296407c103", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a0b25cb-ad", "ovs_interfaceid": "7a0b25cb-ada6-4f3c-bdd7-a9c65d503530", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Releasing lock "refresh_cache-2874b7cd-8c61-46d5-a949-8a8c2469b9af" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.compute.manager [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Instance network_info: |[{"id": "7a0b25cb-ada6-4f3c-bdd7-a9c65d503530", "address": "fa:16:3e:5e:e5:c2", "network": {"id": "52844f20-cd97-4994-ae95-56c25466dfd5", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-534067309-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "730b5f2c76df4e769572f6296407c103", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": 
{"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a0b25cb-ad", "ovs_interfaceid": "7a0b25cb-ada6-4f3c-bdd7-a9c65d503530", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71283) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-2078e028-bb30-4ea3-b086-a06c62dd4bc4 req-74a8d581-7f37-491c-8ead-0f9fcd7cbdab service nova] Acquired lock "refresh_cache-2874b7cd-8c61-46d5-a949-8a8c2469b9af" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.network.neutron [req-2078e028-bb30-4ea3-b086-a06c62dd4bc4 req-74a8d581-7f37-491c-8ead-0f9fcd7cbdab service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Refreshing network info cache for port 7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Start _get_guest_xml network_info=[{"id": "7a0b25cb-ada6-4f3c-bdd7-a9c65d503530", "address": "fa:16:3e:5e:e5:c2", "network": {"id": "52844f20-cd97-4994-ae95-56c25466dfd5", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-534067309-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": 
"730b5f2c76df4e769572f6296407c103", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a0b25cb-ad", "ovs_interfaceid": "7a0b25cb-ada6-4f3c-bdd7-a9c65d503530", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'encryption_format': None, 'encryption_options': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '3687b278-c2bc-46f2-9b7e-579c8d06fe41'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 10:58:01 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:58:01 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71283) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T10:25:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T10:24:36Z,direct_url=,disk_format='qcow2',id=3687b278-c2bc-46f2-9b7e-579c8d06fe41,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='fda053593ca94fea8fa552c5a6c6e5d5',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T10:24:37Z,virtual_size=,visibility=), allow threads: True {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 
tempest-SnapshotDataIntegrityTests-268465305-project-member] Flavor limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Image limits 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Flavor pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Image pref 0:0:0 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71283) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 10:58:01 user 
nova-compute[71283]: DEBUG nova.virt.hardware [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Got 1 possible topologies {{(pid=71283) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.virt.hardware [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71283) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:57:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SnapshotDataIntegrityTests-server-1487722819',display_name='tempest-SnapshotDataIntegrityTests-server-1487722819',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-snapshotdataintegritytests-server-1487722819',id=25,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMBPIT+VvH3x2WPe1QtbwJQBTzLEjebnVimjNNjbfbb27yWLPSdAy/C41BLBv8diKVmHQXGuhKLbD0OVFASZoKFC6Q50u+178oCkH5VTSnQCVI+WIQW8XWFJi1FKNNpilg==',key_name='tempest-SnapshotDataIntegrityTests-590761624',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='730b5f2c76df4e769572f6296407c103',ramdisk_id='',reservation_id='r-wg7k0j6k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-Snapshot
DataIntegrityTests-268465305',owner_user_name='tempest-SnapshotDataIntegrityTests-268465305-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:58:00Z,user_data=None,user_id='1efe465dafde459899d76107759d64e9',uuid=2874b7cd-8c61-46d5-a949-8a8c2469b9af,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7a0b25cb-ada6-4f3c-bdd7-a9c65d503530", "address": "fa:16:3e:5e:e5:c2", "network": {"id": "52844f20-cd97-4994-ae95-56c25466dfd5", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-534067309-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "730b5f2c76df4e769572f6296407c103", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a0b25cb-ad", "ovs_interfaceid": "7a0b25cb-ada6-4f3c-bdd7-a9c65d503530", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71283) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Converting VIF {"id": "7a0b25cb-ada6-4f3c-bdd7-a9c65d503530", "address": "fa:16:3e:5e:e5:c2", "network": {"id": "52844f20-cd97-4994-ae95-56c25466dfd5", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-534067309-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": 
{"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "730b5f2c76df4e769572f6296407c103", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a0b25cb-ad", "ovs_interfaceid": "7a0b25cb-ada6-4f3c-bdd7-a9c65d503530", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:e5:c2,bridge_name='br-int',has_traffic_filtering=True,id=7a0b25cb-ada6-4f3c-bdd7-a9c65d503530,network=Network(52844f20-cd97-4994-ae95-56c25466dfd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a0b25cb-ad') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.objects.instance [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Lazy-loading 'pci_devices' on Instance uuid 2874b7cd-8c61-46d5-a949-8a8c2469b9af {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 
tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] End _get_guest_xml xml= [libvirt domain XML elided: element markup was stripped in capture; recoverable values: uuid 2874b7cd-8c61-46d5-a949-8a8c2469b9af, name instance-00000019, display name tempest-SnapshotDataIntegrityTests-server-1487722819, memory 131072 KiB, 1 vCPU, owner tempest-SnapshotDataIntegrityTests-268465305-project-member / tempest-SnapshotDataIntegrityTests-268465305, sysinfo "OpenStack Foundation" "OpenStack Nova" 0.0.0, os type hvm, CPU model Nehalem, RNG backend /dev/urandom] Apr
20 10:58:01 user nova-compute[71283]: Apr 20 10:58:01 user nova-compute[71283]: {{(pid=71283) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:57:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SnapshotDataIntegrityTests-server-1487722819',display_name='tempest-SnapshotDataIntegrityTests-server-1487722819',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-snapshotdataintegritytests-server-1487722819',id=25,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMBPIT+VvH3x2WPe1QtbwJQBTzLEjebnVimjNNjbfbb27yWLPSdAy/C41BLBv8diKVmHQXGuhKLbD0OVFASZoKFC6Q50u+178oCkH5VTSnQCVI+WIQW8XWFJi1FKNNpilg==',key_name='tempest-SnapshotDataIntegrityTests-590761624',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='730b5f2c76df4e769572f6296407c103',ramdisk_id='',reservation_id='r-wg7k0j6k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-SnapshotDataIntegrityTests-268465305',owner_user_name='tempest-SnapshotDataIntegrityTests-268465305-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T10:58:00Z,user_data=None,user_id='1efe465dafde459899d76107759d64e9',uuid=2874b7cd-8c61-46d5-a949-8a8c2469b9af,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7a0b25cb-ada6-4f3c-bdd7-a9c65d503530", "address": "fa:16:3e:5e:e5:c2", "network": {"id": "52844f20-cd97-4994-ae95-56c25466dfd5", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-534067309-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "730b5f2c76df4e769572f6296407c103", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a0b25cb-ad", "ovs_interfaceid": "7a0b25cb-ada6-4f3c-bdd7-a9c65d503530", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Converting VIF {"id": "7a0b25cb-ada6-4f3c-bdd7-a9c65d503530", "address": "fa:16:3e:5e:e5:c2", "network": {"id": "52844f20-cd97-4994-ae95-56c25466dfd5", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-534067309-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "730b5f2c76df4e769572f6296407c103", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a0b25cb-ad", "ovs_interfaceid": "7a0b25cb-ada6-4f3c-bdd7-a9c65d503530", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) nova_to_osvif_vif 
/opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:e5:c2,bridge_name='br-int',has_traffic_filtering=True,id=7a0b25cb-ada6-4f3c-bdd7-a9c65d503530,network=Network(52844f20-cd97-4994-ae95-56c25466dfd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a0b25cb-ad') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG os_vif [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:e5:c2,bridge_name='br-int',has_traffic_filtering=True,id=7a0b25cb-ada6-4f3c-bdd7-a9c65d503530,network=Network(52844f20-cd97-4994-ae95-56c25466dfd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a0b25cb-ad') {{(pid=71283) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71283) do_commit 
/usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7a0b25cb-ad, may_exist=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7a0b25cb-ad, col_values=(('external_ids', {'iface-id': '7a0b25cb-ada6-4f3c-bdd7-a9c65d503530', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5e:e5:c2', 'vm-uuid': '2874b7cd-8c61-46d5-a949-8a8c2469b9af'}),)) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:58:01 user nova-compute[71283]: INFO os_vif [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Successfully plugged vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:5e:e5:c2,bridge_name='br-int',has_traffic_filtering=True,id=7a0b25cb-ada6-4f3c-bdd7-a9c65d503530,network=Network(52844f20-cd97-4994-ae95-56c25466dfd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a0b25cb-ad') Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] No BDM found with device name vda, not building metadata. {{(pid=71283) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 10:58:01 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] No VIF found with MAC fa:16:3e:5e:e5:c2, not building metadata {{(pid=71283) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 10:58:02 user nova-compute[71283]: DEBUG nova.network.neutron [req-2078e028-bb30-4ea3-b086-a06c62dd4bc4 req-74a8d581-7f37-491c-8ead-0f9fcd7cbdab service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Updated VIF entry in instance network info cache for port 7a0b25cb-ada6-4f3c-bdd7-a9c65d503530. 
{{(pid=71283) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 10:58:02 user nova-compute[71283]: DEBUG nova.network.neutron [req-2078e028-bb30-4ea3-b086-a06c62dd4bc4 req-74a8d581-7f37-491c-8ead-0f9fcd7cbdab service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Updating instance_info_cache with network_info: [{"id": "7a0b25cb-ada6-4f3c-bdd7-a9c65d503530", "address": "fa:16:3e:5e:e5:c2", "network": {"id": "52844f20-cd97-4994-ae95-56c25466dfd5", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-534067309-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "730b5f2c76df4e769572f6296407c103", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a0b25cb-ad", "ovs_interfaceid": "7a0b25cb-ada6-4f3c-bdd7-a9c65d503530", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:58:02 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-2078e028-bb30-4ea3-b086-a06c62dd4bc4 req-74a8d581-7f37-491c-8ead-0f9fcd7cbdab service nova] Releasing lock "refresh_cache-2874b7cd-8c61-46d5-a949-8a8c2469b9af" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:58:02 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:58:02 user 
nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:58:02 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:58:03 user nova-compute[71283]: DEBUG nova.compute.manager [req-0537b6c4-9264-418f-a813-f790ff35ed43 req-05534082-6ed7-4bd9-b473-57688b9e7037 service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Received event network-vif-plugged-7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:58:03 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0537b6c4-9264-418f-a813-f790ff35ed43 req-05534082-6ed7-4bd9-b473-57688b9e7037 service nova] Acquiring lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:58:03 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0537b6c4-9264-418f-a813-f790ff35ed43 req-05534082-6ed7-4bd9-b473-57688b9e7037 service nova] Lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:58:03 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0537b6c4-9264-418f-a813-f790ff35ed43 req-05534082-6ed7-4bd9-b473-57688b9e7037 service nova] Lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:58:03 user 
nova-compute[71283]: DEBUG nova.compute.manager [req-0537b6c4-9264-418f-a813-f790ff35ed43 req-05534082-6ed7-4bd9-b473-57688b9e7037 service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] No waiting events found dispatching network-vif-plugged-7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:58:03 user nova-compute[71283]: WARNING nova.compute.manager [req-0537b6c4-9264-418f-a813-f790ff35ed43 req-05534082-6ed7-4bd9-b473-57688b9e7037 service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Received unexpected event network-vif-plugged-7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 for instance with vm_state building and task_state spawning. Apr 20 10:58:03 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:58:03 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:58:03 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:58:04 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:58:04 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Resumed> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:58:04 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] VM Resumed (Lifecycle Event) Apr 20 10:58:04 user nova-compute[71283]: DEBUG nova.compute.manager [None 
req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Instance event wait completed in 0 seconds for {{(pid=71283) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 10:58:04 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Guest created on hypervisor {{(pid=71283) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 10:58:04 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Instance spawned successfully. Apr 20 10:58:04 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 10:58:04 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:58:04 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 
0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:58:04 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Found default for hw_cdrom_bus of ide {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:58:04 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Found default for hw_disk_bus of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:58:04 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Found default for hw_input_bus of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:58:04 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Found default for hw_pointer_model of None {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:58:04 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] 
[instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Found default for hw_video_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:58:04 user nova-compute[71283]: DEBUG nova.virt.libvirt.driver [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Found default for hw_vif_model of virtio {{(pid=71283) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 10:58:04 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 10:58:04 user nova-compute[71283]: DEBUG nova.virt.driver [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] Emitting event Started> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 10:58:04 user nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] VM Started (Lifecycle Event) Apr 20 10:58:04 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:58:04 user nova-compute[71283]: DEBUG nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71283) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 10:58:04 user 
nova-compute[71283]: INFO nova.compute.manager [None req-86d10cff-c630-4aa3-b8c4-d0b68b93e525 None None] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 10:58:05 user nova-compute[71283]: INFO nova.compute.manager [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Took 5.34 seconds to spawn the instance on the hypervisor. Apr 20 10:58:05 user nova-compute[71283]: DEBUG nova.compute.manager [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 10:58:05 user nova-compute[71283]: INFO nova.compute.manager [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Took 5.84 seconds to build instance. 
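[annotation] The "Synchronizing instance power state" and "During sync_power_state the instance has a pending task (spawning). Skip." lines above reflect a guard in Nova's power-state sync: while a task is in flight, a mismatch between the DB power_state (0, NOSTATE) and the hypervisor's (1, RUNNING) is expected and left alone. A minimal illustrative sketch of that decision (not Nova's actual code; function name and return strings are invented for illustration):

```python
# Illustrative sketch of the power-state sync guard seen in the log.
# NOT Nova's actual implementation; names below are hypothetical.

NOSTATE = 0  # DB power_state before spawn completes
RUNNING = 1  # state libvirt reports once the guest starts

def sync_power_state(task_state, db_power_state, vm_power_state):
    """Return the action the sync step would take for one instance."""
    if task_state is not None:
        # Mirrors: "During sync_power_state the instance has a
        # pending task (spawning). Skip."
        return "skip: pending task (%s)" % task_state
    if db_power_state != vm_power_state:
        # Only with no task in flight is the mismatch acted upon.
        return "update DB power_state to %d" % vm_power_state
    return "in sync"

print(sync_power_state("spawning", NOSTATE, RUNNING))  # → skip: pending task (spawning)
print(sync_power_state(None, NOSTATE, RUNNING))        # → update DB power_state to 1
```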
Apr 20 10:58:05 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-e1b4df8f-7bf7-42b9-ba77-f934574438c7 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 5.936s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:58:05 user nova-compute[71283]: DEBUG nova.compute.manager [req-ad6565e4-c854-4111-b7c9-67d170d7f737 req-ebdb60cc-a112-4760-a319-5a88a1a0993c service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Received event network-vif-plugged-7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:58:05 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-ad6565e4-c854-4111-b7c9-67d170d7f737 req-ebdb60cc-a112-4760-a319-5a88a1a0993c service nova] Acquiring lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:58:05 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-ad6565e4-c854-4111-b7c9-67d170d7f737 req-ebdb60cc-a112-4760-a319-5a88a1a0993c service nova] Lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:58:05 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-ad6565e4-c854-4111-b7c9-67d170d7f737 req-ebdb60cc-a112-4760-a319-5a88a1a0993c service nova] Lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 
0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:58:05 user nova-compute[71283]: DEBUG nova.compute.manager [req-ad6565e4-c854-4111-b7c9-67d170d7f737 req-ebdb60cc-a112-4760-a319-5a88a1a0993c service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] No waiting events found dispatching network-vif-plugged-7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:58:05 user nova-compute[71283]: WARNING nova.compute.manager [req-ad6565e4-c854-4111-b7c9-67d170d7f737 req-ebdb60cc-a112-4760-a319-5a88a1a0993c service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Received unexpected event network-vif-plugged-7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 for instance with vm_state active and task_state None. Apr 20 10:58:06 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:58:09 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:58:11 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:58:16 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:58:19 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:58:21 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:58:24 user 
nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:58:26 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:58:31 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:58:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:58:35 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:58:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:58:37 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:58:38 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:58:39 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task 
ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:58:39 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:58:41 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:58:41 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:58:41 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:58:41 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:58:41 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:58:41 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:58:41 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:58:41 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:58:41 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:58:41 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:58:41 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:58:41 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2874b7cd-8c61-46d5-a949-8a8c2469b9af/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:58:41 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/2874b7cd-8c61-46d5-a949-8a8c2469b9af/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:58:41 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2874b7cd-8c61-46d5-a949-8a8c2469b9af/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:58:42 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2874b7cd-8c61-46d5-a949-8a8c2469b9af/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:58:42 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:58:42 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:58:42 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=9129MB free_disk=26.471580505371094GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 
20 10:58:42 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:58:42 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:58:42 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 2874b7cd-8c61-46d5-a949-8a8c2469b9af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:58:42 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:58:42 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:58:42 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:58:42 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:58:42 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:58:42 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.209s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:58:43 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:58:43 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:58:43 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:58:45 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:58:45 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:58:45 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] 
Rebuilding the list of instances to heal {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 10:58:45 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "refresh_cache-2874b7cd-8c61-46d5-a949-8a8c2469b9af" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:58:45 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquired lock "refresh_cache-2874b7cd-8c61-46d5-a949-8a8c2469b9af" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:58:45 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Forcefully refreshing network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 10:58:45 user nova-compute[71283]: DEBUG nova.objects.instance [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lazy-loading 'info_cache' on Instance uuid 2874b7cd-8c61-46d5-a949-8a8c2469b9af {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:58:46 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Updating instance_info_cache with network_info: [{"id": "7a0b25cb-ada6-4f3c-bdd7-a9c65d503530", "address": "fa:16:3e:5e:e5:c2", "network": {"id": "52844f20-cd97-4994-ae95-56c25466dfd5", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-534067309-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": 
[], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "730b5f2c76df4e769572f6296407c103", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a0b25cb-ad", "ovs_interfaceid": "7a0b25cb-ada6-4f3c-bdd7-a9c65d503530", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:58:46 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Releasing lock "refresh_cache-2874b7cd-8c61-46d5-a949-8a8c2469b9af" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:58:46 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Updated the network info_cache for instance {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 10:58:46 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:58:51 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:58:54 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:58:54 user nova-compute[71283]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:58:56 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:59:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:59:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:59:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:59:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:59:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:59:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:59:06 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:59:11 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:59:11 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 
{{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:59:11 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:59:11 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:59:11 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:59:11 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:59:16 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:59:21 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:59:26 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:59:31 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:59:34 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:59:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:59:37 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:59:37 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:59:38 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:59:40 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:59:40 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 10:59:41 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:59:41 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:59:41 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 10:59:41 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:59:41 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 10:59:41 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:59:41 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:59:41 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:59:41 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils 
[None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:59:41 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:59:41 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 10:59:41 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2874b7cd-8c61-46d5-a949-8a8c2469b9af/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:59:41 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2874b7cd-8c61-46d5-a949-8a8c2469b9af/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:59:41 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2874b7cd-8c61-46d5-a949-8a8c2469b9af/disk --force-share --output=json {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 10:59:42 user nova-compute[71283]: DEBUG oslo_concurrency.processutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2874b7cd-8c61-46d5-a949-8a8c2469b9af/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71283) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 10:59:42 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 10:59:42 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 10:59:42 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=9151MB free_disk=26.47100067138672GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 
20 10:59:42 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:59:42 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:59:42 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Instance 2874b7cd-8c61-46d5-a949-8a8c2469b9af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71283) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 10:59:42 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 10:59:42 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 10:59:42 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:59:42 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:59:42 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 10:59:42 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.187s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:59:42 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:59:42 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Cleaning up deleted instances {{(pid=71283) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 20 10:59:42 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] There are 0 instances to clean {{(pid=71283) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 20 10:59:42 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:59:43 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:59:44 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task 
ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:59:44 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:59:46 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:59:46 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:59:46 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 10:59:46 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Rebuilding the list of instances to heal {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 10:59:46 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "refresh_cache-2874b7cd-8c61-46d5-a949-8a8c2469b9af" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 10:59:46 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquired lock "refresh_cache-2874b7cd-8c61-46d5-a949-8a8c2469b9af" {{(pid=71283) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 10:59:46 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Forcefully refreshing network info cache for instance {{(pid=71283) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 10:59:46 user nova-compute[71283]: DEBUG nova.objects.instance [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lazy-loading 'info_cache' on Instance uuid 2874b7cd-8c61-46d5-a949-8a8c2469b9af {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:59:47 user nova-compute[71283]: DEBUG nova.network.neutron [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Updating instance_info_cache with network_info: [{"id": "7a0b25cb-ada6-4f3c-bdd7-a9c65d503530", "address": "fa:16:3e:5e:e5:c2", "network": {"id": "52844f20-cd97-4994-ae95-56c25466dfd5", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-534067309-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "730b5f2c76df4e769572f6296407c103", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a0b25cb-ad", "ovs_interfaceid": "7a0b25cb-ada6-4f3c-bdd7-a9c65d503530", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 
20 10:59:47 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Releasing lock "refresh_cache-2874b7cd-8c61-46d5-a949-8a8c2469b9af" {{(pid=71283) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 10:59:47 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Updated the network info_cache for instance {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-4139d69d-c636-4f9a-97e1-cd2867ca2962 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Acquiring lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-4139d69d-c636-4f9a-97e1-cd2867ca2962 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-4139d69d-c636-4f9a-97e1-cd2867ca2962 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Acquiring lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=71283) inner
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-4139d69d-c636-4f9a-97e1-cd2867ca2962 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-4139d69d-c636-4f9a-97e1-cd2867ca2962 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:59:51 user nova-compute[71283]: INFO nova.compute.manager [None req-4139d69d-c636-4f9a-97e1-cd2867ca2962 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Terminating instance Apr 20 10:59:51 user nova-compute[71283]: DEBUG nova.compute.manager [None req-4139d69d-c636-4f9a-97e1-cd2867ca2962 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Start destroying the instance on the hypervisor. 
{{(pid=71283) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG nova.compute.manager [req-0e603176-a582-49b1-8c3d-8451e1a92737 req-0525f720-82d1-4d4e-9b81-e3c35a666cf2 service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Received event network-vif-unplugged-7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0e603176-a582-49b1-8c3d-8451e1a92737 req-0525f720-82d1-4d4e-9b81-e3c35a666cf2 service nova] Acquiring lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0e603176-a582-49b1-8c3d-8451e1a92737 req-0525f720-82d1-4d4e-9b81-e3c35a666cf2 service nova] Lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-0e603176-a582-49b1-8c3d-8451e1a92737 req-0525f720-82d1-4d4e-9b81-e3c35a666cf2 service 
nova] Lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG nova.compute.manager [req-0e603176-a582-49b1-8c3d-8451e1a92737 req-0525f720-82d1-4d4e-9b81-e3c35a666cf2 service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] No waiting events found dispatching network-vif-unplugged-7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG nova.compute.manager [req-0e603176-a582-49b1-8c3d-8451e1a92737 req-0525f720-82d1-4d4e-9b81-e3c35a666cf2 service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Received event network-vif-unplugged-7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 for instance with task_state deleting. {{(pid=71283) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 
10:59:51 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:59:51 user nova-compute[71283]: INFO nova.virt.libvirt.driver [-] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Instance destroyed successfully. Apr 20 10:59:51 user nova-compute[71283]: DEBUG nova.objects.instance [None req-4139d69d-c636-4f9a-97e1-cd2867ca2962 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Lazy-loading 'resources' on Instance uuid 2874b7cd-8c61-46d5-a949-8a8c2469b9af {{(pid=71283) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG nova.virt.libvirt.vif [None req-4139d69d-c636-4f9a-97e1-cd2867ca2962 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T10:57:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SnapshotDataIntegrityTests-server-1487722819',display_name='tempest-SnapshotDataIntegrityTests-server-1487722819',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-snapshotdataintegritytests-server-1487722819',id=25,image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMBPIT+VvH3x2WPe1QtbwJQBTzLEjebnVimjNNjbfbb27yWLPSdAy/C41BLBv8diKVmHQXGuhKLbD0OVFASZoKFC6Q50u+178oCkH5VTSnQCVI+WIQW8XWFJi1FKNNpilg==',key_name='tempest-SnapshotDataIntegrityTests-590761624',keypairs=<?>,launch_index=0,launched_at=2023-04-20T10:58:05Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='730b5f2c76df4e769572f6296407c103',ramdisk_id='',reservation_id='r-wg7k0j6k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3687b278-c2bc-46f2-9b7e-579c8d06fe41',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-SnapshotDataIntegrityTests-268465305',owner_user_name='tempest-SnapshotDataIntegrityTests-268465305-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2023-04-20T10:58:05Z,user_data=None,user_id='1efe465dafde459899d76107759d64e9',uuid=2874b7cd-8c61-46d5-a949-8a8c2469b9af,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7a0b25cb-ada6-4f3c-bdd7-a9c65d503530", "address": "fa:16:3e:5e:e5:c2", "network": {"id": "52844f20-cd97-4994-ae95-56c25466dfd5", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-534067309-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", 
"type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "730b5f2c76df4e769572f6296407c103", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a0b25cb-ad", "ovs_interfaceid": "7a0b25cb-ada6-4f3c-bdd7-a9c65d503530", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71283) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-4139d69d-c636-4f9a-97e1-cd2867ca2962 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Converting VIF {"id": "7a0b25cb-ada6-4f3c-bdd7-a9c65d503530", "address": "fa:16:3e:5e:e5:c2", "network": {"id": "52844f20-cd97-4994-ae95-56c25466dfd5", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-534067309-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "730b5f2c76df4e769572f6296407c103", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a0b25cb-ad", "ovs_interfaceid": "7a0b25cb-ada6-4f3c-bdd7-a9c65d503530", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, 
"meta": {}} {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG nova.network.os_vif_util [None req-4139d69d-c636-4f9a-97e1-cd2867ca2962 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5e:e5:c2,bridge_name='br-int',has_traffic_filtering=True,id=7a0b25cb-ada6-4f3c-bdd7-a9c65d503530,network=Network(52844f20-cd97-4994-ae95-56c25466dfd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a0b25cb-ad') {{(pid=71283) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG os_vif [None req-4139d69d-c636-4f9a-97e1-cd2867ca2962 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5e:e5:c2,bridge_name='br-int',has_traffic_filtering=True,id=7a0b25cb-ada6-4f3c-bdd7-a9c65d503530,network=Network(52844f20-cd97-4994-ae95-56c25466dfd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a0b25cb-ad') {{(pid=71283) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a0b25cb-ad, bridge=br-int, if_exists=True) {{(pid=71283) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 10:59:51 user nova-compute[71283]: INFO os_vif [None req-4139d69d-c636-4f9a-97e1-cd2867ca2962 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5e:e5:c2,bridge_name='br-int',has_traffic_filtering=True,id=7a0b25cb-ada6-4f3c-bdd7-a9c65d503530,network=Network(52844f20-cd97-4994-ae95-56c25466dfd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a0b25cb-ad') Apr 20 10:59:51 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-4139d69d-c636-4f9a-97e1-cd2867ca2962 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Deleting instance files /opt/stack/data/nova/instances/2874b7cd-8c61-46d5-a949-8a8c2469b9af_del Apr 20 10:59:51 user nova-compute[71283]: INFO nova.virt.libvirt.driver [None req-4139d69d-c636-4f9a-97e1-cd2867ca2962 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Deletion of /opt/stack/data/nova/instances/2874b7cd-8c61-46d5-a949-8a8c2469b9af_del complete Apr 20 10:59:51 user nova-compute[71283]: INFO nova.compute.manager [None req-4139d69d-c636-4f9a-97e1-cd2867ca2962 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Took 0.68 seconds to destroy the instance on the hypervisor. 
Apr 20 10:59:51 user nova-compute[71283]: DEBUG oslo.service.loopingcall [None req-4139d69d-c636-4f9a-97e1-cd2867ca2962 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=71283) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG nova.compute.manager [-] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Deallocating network for instance {{(pid=71283) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 10:59:51 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] deallocate_for_instance() {{(pid=71283) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 10:59:52 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:59:52 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:59:52 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:59:52 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 10:59:52 user nova-compute[71283]: DEBUG nova.network.neutron [-] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Updating instance_info_cache with network_info: [] {{(pid=71283) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 10:59:52 user nova-compute[71283]: INFO nova.compute.manager 
[-] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Took 0.91 seconds to deallocate network for instance. Apr 20 10:59:52 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-4139d69d-c636-4f9a-97e1-cd2867ca2962 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:59:52 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-4139d69d-c636-4f9a-97e1-cd2867ca2962 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:59:52 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-4139d69d-c636-4f9a-97e1-cd2867ca2962 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 10:59:52 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-4139d69d-c636-4f9a-97e1-cd2867ca2962 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 
'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 10:59:52 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-4139d69d-c636-4f9a-97e1-cd2867ca2962 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.120s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:59:52 user nova-compute[71283]: INFO nova.scheduler.client.report [None req-4139d69d-c636-4f9a-97e1-cd2867ca2962 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Deleted allocations for instance 2874b7cd-8c61-46d5-a949-8a8c2469b9af Apr 20 10:59:52 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-4139d69d-c636-4f9a-97e1-cd2867ca2962 tempest-SnapshotDataIntegrityTests-268465305 tempest-SnapshotDataIntegrityTests-268465305-project-member] Lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.878s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:59:53 user nova-compute[71283]: DEBUG nova.compute.manager [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Received event network-vif-plugged-7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:59:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] Acquiring lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af-events" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:59:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] Lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:59:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] Lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:59:53 user nova-compute[71283]: DEBUG nova.compute.manager [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] No waiting events found dispatching network-vif-plugged-7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:59:53 user nova-compute[71283]: WARNING nova.compute.manager [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Received unexpected event network-vif-plugged-7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 for instance with vm_state deleted and task_state None. 
Apr 20 10:59:53 user nova-compute[71283]: DEBUG nova.compute.manager [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Received event network-vif-plugged-7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:59:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] Acquiring lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:59:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] Lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:59:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] Lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:59:53 user nova-compute[71283]: DEBUG nova.compute.manager [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] No waiting events found dispatching network-vif-plugged-7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 
20 10:59:53 user nova-compute[71283]: WARNING nova.compute.manager [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Received unexpected event network-vif-plugged-7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 for instance with vm_state deleted and task_state None. Apr 20 10:59:53 user nova-compute[71283]: DEBUG nova.compute.manager [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Received event network-vif-plugged-7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:59:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] Acquiring lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:59:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] Lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:59:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] Lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:59:53 user 
nova-compute[71283]: DEBUG nova.compute.manager [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] No waiting events found dispatching network-vif-plugged-7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:59:53 user nova-compute[71283]: WARNING nova.compute.manager [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Received unexpected event network-vif-plugged-7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 for instance with vm_state deleted and task_state None. Apr 20 10:59:53 user nova-compute[71283]: DEBUG nova.compute.manager [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Received event network-vif-unplugged-7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:59:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] Acquiring lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:59:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] Lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:59:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils 
[req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] Lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:59:53 user nova-compute[71283]: DEBUG nova.compute.manager [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] No waiting events found dispatching network-vif-unplugged-7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:59:53 user nova-compute[71283]: WARNING nova.compute.manager [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Received unexpected event network-vif-unplugged-7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 for instance with vm_state deleted and task_state None. 
Apr 20 10:59:53 user nova-compute[71283]: DEBUG nova.compute.manager [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Received event network-vif-deleted-7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:59:53 user nova-compute[71283]: DEBUG nova.compute.manager [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Received event network-vif-plugged-7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 {{(pid=71283) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 10:59:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] Acquiring lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 10:59:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] Lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 10:59:53 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] Lock "2874b7cd-8c61-46d5-a949-8a8c2469b9af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 10:59:53 
user nova-compute[71283]: DEBUG nova.compute.manager [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] No waiting events found dispatching network-vif-plugged-7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 {{(pid=71283) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 10:59:53 user nova-compute[71283]: WARNING nova.compute.manager [req-dff7ca3a-985a-4572-a702-37c56f23877c req-94ae1215-cea3-4c14-b5cb-4bf4699d334b service nova] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Received unexpected event network-vif-plugged-7a0b25cb-ada6-4f3c-bdd7-a9c65d503530 for instance with vm_state deleted and task_state None. Apr 20 10:59:55 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 10:59:55 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Cleaning up deleted instances with incomplete migration {{(pid=71283) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 20 10:59:56 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 11:00:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 11:00:06 user nova-compute[71283]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71283) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 11:00:06 user nova-compute[71283]: INFO nova.compute.manager [-] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] VM Stopped (Lifecycle Event) Apr 
20 11:00:06 user nova-compute[71283]: DEBUG nova.compute.manager [None req-ed87f5a5-a349-4e75-9796-b029e06185f9 None None] [instance: 2874b7cd-8c61-46d5-a949-8a8c2469b9af] Checking state {{(pid=71283) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 11:00:06 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 11:00:11 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 11:00:16 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 11:00:21 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 11:00:26 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 11:00:31 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 11:00:36 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 11:00:37 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 11:00:38 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] 
Running periodic task ComputeManager._check_instance_build_time {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 11:00:39 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 11:00:40 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 11:00:40 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71283) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 11:00:41 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 11:00:41 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager.update_available_resource {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 11:00:41 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 11:00:41 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71283) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 11:00:41 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 11:00:41 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 11:00:41 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Auditing locally available compute resources for user (node: user) {{(pid=71283) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 11:00:42 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 11:00:42 user nova-compute[71283]: WARNING nova.virt.libvirt.driver [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 11:00:42 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Hypervisor/Node resource view: name=user free_ram=9250MB free_disk=26.489673614501953GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}] {{(pid=71283) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 
20 11:00:42 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 11:00:42 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 11:00:42 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 11:00:42 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71283) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 11:00:42 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Refreshing inventories for resource provider bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 20 11:00:42 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Updating ProviderTree inventory for provider bdbc83bd-9307-4e20-8e3d-430b77499399 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 
'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 20 11:00:42 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Updating inventory in ProviderTree for provider bdbc83bd-9307-4e20-8e3d-430b77499399 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 20 11:00:42 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Refreshing aggregate associations for resource provider bdbc83bd-9307-4e20-8e3d-430b77499399, aggregates: None {{(pid=71283) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 20 11:00:42 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Refreshing trait associations for resource provider bdbc83bd-9307-4e20-8e3d-430b77499399, traits: 
COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_EXTEND,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE {{(pid=71283) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 20 11:00:42 user nova-compute[71283]: DEBUG nova.compute.provider_tree [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed in ProviderTree for provider: bdbc83bd-9307-4e20-8e3d-430b77499399 {{(pid=71283) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 11:00:42 user nova-compute[71283]: DEBUG nova.scheduler.client.report [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Inventory has not changed for provider bdbc83bd-9307-4e20-8e3d-430b77499399 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 
'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71283) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 11:00:42 user nova-compute[71283]: DEBUG nova.compute.resource_tracker [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Compute_service record updated for user:user {{(pid=71283) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 11:00:42 user nova-compute[71283]: DEBUG oslo_concurrency.lockutils [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.396s {{(pid=71283) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 11:00:43 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 11:00:43 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 11:00:44 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 11:00:46 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 11:00:46 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None 
req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 11:00:46 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Starting heal instance info cache {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 11:00:46 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Rebuilding the list of instances to heal {{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 11:00:46 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 11:00:46 user nova-compute[71283]: DEBUG nova.compute.manager [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Didn't find any instances for network info cache update. 
{{(pid=71283) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 20 11:00:46 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 11:00:51 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 11:00:51 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 11:00:51 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 11:00:51 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 11:00:51 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 11:00:51 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 11:00:55 user nova-compute[71283]: DEBUG oslo_service.periodic_task [None req-08935a5a-6089-434e-ba0a-ae4e16d6934a None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71283) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 11:00:56 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 
{{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 11:01:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 11:01:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 11:01:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71283) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 11:01:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 11:01:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71283) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 11:01:01 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 11:01:06 user nova-compute[71283]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71283) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}}