Apr 24 00:07:04 user nova-compute[71205]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
Apr 24 00:07:07 user nova-compute[71205]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=71205) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Apr 24 00:07:07 user nova-compute[71205]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=71205) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Apr 24 00:07:07 user nova-compute[71205]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=71205) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Apr 24 00:07:07 user nova-compute[71205]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.018s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 24 00:07:07 user nova-compute[71205]: INFO nova.virt.driver [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] Loading compute driver 'libvirt.LibvirtDriver'
Apr 24 00:07:07 user nova-compute[71205]: INFO nova.compute.provider_config [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
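The two oslo_concurrency.processutils DEBUG lines above are the service probing whether the installed iscsiadm binary supports the node.session.scan option. A minimal sketch of the same probe outside Nova, assuming oslo.concurrency is installed (illustrative code, not Nova's own):

    import logging
    from oslo_concurrency import processutils

    # DEBUG logging shows the "Running cmd (subprocess)" / "CMD ... returned" lines
    # that processutils itself emits, as seen in the log above.
    logging.basicConfig(level=logging.DEBUG)

    # grep exits 0 when the string is present and 1 when it is absent;
    # accept both so a missing option is reported rather than raised.
    out, _err = processutils.execute(
        'grep', '-F', 'node.session.scan', '/sbin/iscsiadm',
        check_exit_code=[0, 1])
    print('iscsiadm supports node.session.scan:', bool(out))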
Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] Acquiring lock "singleton_lock" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] Acquired lock "singleton_lock" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] Releasing lock "singleton_lock" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] Full set of CONF: {{(pid=71205) _wait_for_exit_or_signal /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ******************************************************************************** {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] Configuration options gathered from: {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] command line args: ['--config-file', '/etc/nova/nova-cpu.conf'] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] config files: ['/etc/nova/nova-cpu.conf'] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ================================================================================ {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] allow_resize_to_same_host = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] arq_binding_timeout = 300 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] backdoor_port = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] backdoor_socket = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 
None None] block_device_allocate_retries = 300 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] block_device_allocate_retries_interval = 5 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cert = self.pem {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] compute_driver = libvirt.LibvirtDriver {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] compute_monitors = [] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] config_dir = [] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] config_drive_format = iso9660 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] config_file = ['/etc/nova/nova-cpu.conf'] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] config_source = [] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] console_host = user {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] control_exchange = nova {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cpu_allocation_ratio = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] daemon = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] debug = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] default_access_ip_network_name = None {{(pid=71205) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] default_availability_zone = nova {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] default_ephemeral_format = ext4 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] default_schedule_zone = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] disk_allocation_ratio = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] enable_new_services = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] enabled_apis = ['osapi_compute'] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] enabled_ssl_apis = [] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] flat_injected = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] force_config_drive = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] force_raw_images = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] graceful_shutdown_timeout = 5 {{(pid=71205) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] heal_instance_info_cache_interval = 60 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] host = user {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] initial_disk_allocation_ratio = 1.0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] initial_ram_allocation_ratio = 1.0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] instance_build_timeout = 0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] instance_delete_interval = 300 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] instance_format = [instance: %(uuid)s] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] instance_name_template = instance-%08x {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] instance_usage_audit = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] instance_usage_audit_period = month {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] 
instances_path = /opt/stack/data/nova/instances {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] internal_service_availability_zone = internal {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] key = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] live_migration_retry_count = 30 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] log_config_append = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] log_dir = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] log_file = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] log_options = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] log_rotate_interval = 1 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] log_rotate_interval_type = days {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] log_rotation_type = none {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=71205) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] long_rpc_timeout = 1800 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] max_concurrent_builds = 10 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] max_concurrent_live_migrations = 1 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] max_concurrent_snapshots = 5 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] max_local_block_devices = 3 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] max_logfile_count = 30 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] max_logfile_size_mb = 200 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] maximum_instance_delete_attempts = 5 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] metadata_listen = 0.0.0.0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] metadata_listen_port = 8775 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user 
nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] metadata_workers = 3 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] migrate_max_retries = -1 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] mkisofs_cmd = genisoimage {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] my_block_storage_ip = 10.0.0.210 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] my_ip = 10.0.0.210 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] network_allocate_retries = 0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] osapi_compute_listen = 0.0.0.0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] osapi_compute_listen_port = 8774 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] osapi_compute_unique_server_name_scope = {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] osapi_compute_workers = 3 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] password_length = 12 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] periodic_enable = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] periodic_fuzzy_delay = 60 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: 
DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] pointer_model = ps2mouse {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] preallocate_images = none {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] publish_errors = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] pybasedir = /opt/stack/nova {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ram_allocation_ratio = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] rate_limit_burst = 0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] rate_limit_except_level = CRITICAL {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] rate_limit_interval = 0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] reboot_timeout = 0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] reclaim_instance_interval = 0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] record = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] reimage_timeout_per_gb = 20 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] report_interval = 10 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] rescue_timeout = 0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] reserved_host_cpus = 
0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] reserved_host_disk_mb = 0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] reserved_host_memory_mb = 512 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] reserved_huge_pages = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] resize_confirm_window = 0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] resize_fs_using_block_device = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] resume_guests_state_on_host_boot = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] rpc_response_timeout = 60 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] run_external_periodic_tasks = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] running_deleted_instance_action = reap {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] running_deleted_instance_poll_interval = 1800 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] running_deleted_instance_timeout = 0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] scheduler_instance_sync_interval = 120 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None 
None] service_down_time = 60 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] servicegroup_driver = db {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] shelved_offload_time = 0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] shelved_poll_interval = 3600 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] shutdown_timeout = 0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] source_is_ipv6 = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ssl_only = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] state_path = /opt/stack/data/nova {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] sync_power_state_interval = 600 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] sync_power_state_pool_size = 1000 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] syslog_log_facility = LOG_USER {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] tempdir = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] timeout_nbd = 10 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] transport_url = **** {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] update_resources_interval = 0 {{(pid=71205) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] use_cow_images = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] use_eventlog = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] use_journal = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] use_json = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] use_rootwrap_daemon = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] use_stderr = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] use_syslog = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vcpu_pin_set = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vif_plugging_is_fatal = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vif_plugging_timeout = 0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] virt_mkfs = [] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] volume_usage_poll_interval = 0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] watch_log_file = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] web = /usr/share/spice-html5 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG 
oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_concurrency.disable_process_locking = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_concurrency.lock_path = /opt/stack/data/nova {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api.auth_strategy = keystone {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api.compute_link_prefix = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api.dhcp_domain = novalocal {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api.enable_instance_password = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api.glance_link_prefix = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service 
[None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api.instance_list_per_project_cells = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api.list_records_by_skipping_down_cells = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api.local_metadata_per_cell = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api.max_limit = 1000 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api.metadata_cache_expiration = 15 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api.neutron_default_tenant_id = default {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api.use_forwarded_for = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api.use_neutron_default_nets = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=71205) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api.vendordata_dynamic_targets = [] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api.vendordata_jsonfile_path = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.backend = dogpile.cache.memcached {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.backend_argument = **** {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.config_prefix = cache.oslo {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.dead_timeout = 60.0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.debug_cache_backend = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.enable_retry_client = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.enable_socket_keepalive = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.enabled = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.expiration_time = 600 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.hashclient_retry_attempts = 2 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.hashclient_retry_delay = 1.0 
{{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.memcache_dead_retry = 300 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.memcache_password = {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.memcache_pool_maxsize = 10 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.memcache_sasl_enabled = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.memcache_socket_timeout = 1.0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.memcache_username = {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.proxies = [] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.retry_attempts = 2 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.retry_delay = 0.0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] 
cache.socket_keepalive_count = 1 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.socket_keepalive_idle = 1 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.socket_keepalive_interval = 1 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.tls_allowed_ciphers = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.tls_cafile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.tls_certfile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.tls_enabled = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cache.tls_keyfile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cinder.auth_section = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cinder.auth_type = password {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cinder.cafile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cinder.catalog_info = volumev3::publicURL {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cinder.certfile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cinder.collect_timing = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cinder.cross_az_attach = True 
{{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cinder.debug = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cinder.endpoint_template = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cinder.http_retries = 3 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cinder.insecure = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cinder.keyfile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cinder.os_region_name = RegionOne {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cinder.split_loggers = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cinder.timeout = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] compute.cpu_dedicated_set = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] compute.cpu_shared_set = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] compute.image_type_exclude_list = [] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] compute.max_concurrent_disk_ops = 
0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] compute.max_disk_devices_to_attach = -1 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] compute.resource_provider_association_refresh = 300 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] compute.shutdown_retry_interval = 10 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] conductor.workers = 3 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] console.allowed_origins = [] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] console.ssl_ciphers = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] console.ssl_minimum_version = default {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] consoleauth.token_ttl = 600 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cyborg.cafile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cyborg.certfile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG 
oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cyborg.collect_timing = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cyborg.connect_retries = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cyborg.connect_retry_delay = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cyborg.endpoint_override = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cyborg.insecure = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cyborg.keyfile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cyborg.max_version = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cyborg.min_version = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cyborg.region_name = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cyborg.service_name = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cyborg.service_type = accelerator {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cyborg.split_loggers = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cyborg.status_code_retries = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cyborg.status_code_retry_delay = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None 
req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cyborg.timeout = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] cyborg.version = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] database.backend = sqlalchemy {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] database.connection = **** {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] database.connection_debug = 0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] database.connection_parameters = {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] database.connection_recycle_time = 3600 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] database.connection_trace = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] database.db_inc_retry_interval = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] database.db_max_retries = 20 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] database.db_max_retry_interval = 10 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] database.db_retry_interval = 1 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] database.max_overflow = 50 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service 
[None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] database.max_pool_size = 5 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] database.max_retries = 10 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] database.mysql_enable_ndb = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] database.mysql_wsrep_sync_wait = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] database.pool_timeout = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] database.retry_interval = 10 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] database.slave_connection = **** {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] database.sqlite_synchronous = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api_database.backend = sqlalchemy {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api_database.connection = **** {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api_database.connection_debug = 0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api_database.connection_parameters = {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api_database.connection_recycle_time = 3600 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: 
DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api_database.connection_trace = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api_database.db_inc_retry_interval = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api_database.db_max_retries = 20 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api_database.db_max_retry_interval = 10 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api_database.db_retry_interval = 1 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api_database.max_overflow = 50 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api_database.max_pool_size = 5 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api_database.max_retries = 10 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api_database.mysql_enable_ndb = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api_database.pool_timeout = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api_database.retry_interval = 10 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api_database.slave_connection = **** {{(pid=71205) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] api_database.sqlite_synchronous = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] devices.enabled_mdev_types = [] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ephemeral_storage_encryption.enabled = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.api_servers = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.cafile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.certfile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.collect_timing = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.connect_retries = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.connect_retry_delay = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.debug = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.default_trusted_certificate_ids = [] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] 
glance.enable_certificate_validation = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.enable_rbd_download = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.endpoint_override = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.insecure = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.keyfile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.max_version = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.min_version = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.num_retries = 3 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.rbd_ceph_conf = {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.rbd_connect_timeout = 5 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.rbd_pool = {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.rbd_user = {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.region_name = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.service_name = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.service_type = image {{(pid=71205) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.split_loggers = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.status_code_retries = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.status_code_retry_delay = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.timeout = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.verify_glance_signatures = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] glance.version = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] guestfs.debug = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] hyperv.config_drive_cdrom = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] hyperv.config_drive_inject_password = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] hyperv.enable_instance_metrics_collection = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] hyperv.enable_remotefx = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] 
hyperv.instances_path_share = {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] hyperv.iscsi_initiator_list = [] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] hyperv.limit_cpu_features = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] hyperv.power_state_check_timeframe = 60 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] hyperv.use_multipath_io = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] hyperv.volume_attach_retry_count = 10 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] hyperv.vswitch_name = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] mks.enabled = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service 
[None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] image_cache.manager_interval = 2400 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] image_cache.precache_concurrency = 1 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] image_cache.remove_unused_base_images = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] image_cache.subdirectory_name = _base {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ironic.api_max_retries = 60 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ironic.api_retry_interval = 2 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ironic.auth_section = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ironic.auth_type = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ironic.cafile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ironic.certfile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ironic.collect_timing = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 
24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ironic.connect_retries = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ironic.connect_retry_delay = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ironic.endpoint_override = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ironic.insecure = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ironic.keyfile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ironic.max_version = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ironic.min_version = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ironic.partition_key = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ironic.peer_list = [] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ironic.region_name = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ironic.serial_console_state_timeout = 10 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ironic.service_name = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ironic.service_type = baremetal {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ironic.split_loggers = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG 
oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ironic.status_code_retries = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ironic.status_code_retry_delay = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ironic.timeout = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ironic.version = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] key_manager.fixed_key = **** {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] barbican.barbican_api_version = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] barbican.barbican_endpoint = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] barbican.barbican_endpoint_type = public {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] barbican.barbican_region_name = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] barbican.cafile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] barbican.certfile = None {{(pid=71205) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] barbican.collect_timing = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] barbican.insecure = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] barbican.keyfile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] barbican.number_of_retries = 60 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] barbican.retry_delay = 1 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] barbican.send_service_user_token = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] barbican.split_loggers = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] barbican.timeout = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] barbican.verify_ssl = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] barbican.verify_ssl_path = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] barbican_service_user.auth_section = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] barbican_service_user.auth_type = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] barbican_service_user.cafile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] barbican_service_user.certfile = None {{(pid=71205) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] barbican_service_user.collect_timing = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] barbican_service_user.insecure = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] barbican_service_user.keyfile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] barbican_service_user.split_loggers = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] barbican_service_user.timeout = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vault.approle_role_id = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vault.approle_secret_id = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vault.cafile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vault.certfile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vault.collect_timing = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vault.insecure = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vault.keyfile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vault.kv_mountpoint = secret {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vault.kv_version = 2 {{(pid=71205) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vault.namespace = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vault.root_token_id = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vault.split_loggers = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vault.ssl_ca_crt_file = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vault.timeout = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vault.use_ssl = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] keystone.cafile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] keystone.certfile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] keystone.collect_timing = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] keystone.connect_retries = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] keystone.connect_retry_delay = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] keystone.endpoint_override = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] keystone.insecure = False {{(pid=71205) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] keystone.keyfile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] keystone.max_version = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] keystone.min_version = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] keystone.region_name = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] keystone.service_name = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] keystone.service_type = identity {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] keystone.split_loggers = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] keystone.status_code_retries = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] keystone.status_code_retry_delay = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] keystone.timeout = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] keystone.version = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.connection_uri = {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.cpu_mode = custom {{(pid=71205) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.cpu_model_extra_flags = [] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: WARNING oslo_config.cfg [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] Deprecated: Option "cpu_model" from group "libvirt" is deprecated. Use option "cpu_models" from group "libvirt". Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.cpu_models = ['Nehalem'] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.cpu_power_governor_high = performance {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.cpu_power_governor_low = powersave {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.cpu_power_management = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.device_detach_attempts = 8 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.device_detach_timeout = 20 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.disk_cachemodes = [] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.disk_prefix = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.enabled_perf_events = [] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.file_backed_memory = 0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.gid_maps = [] 
{{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.hw_disk_discard = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.hw_machine_type = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.images_rbd_ceph_conf = {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.images_rbd_glance_store_name = {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.images_rbd_pool = rbd {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.images_type = default {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.images_volume_group = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.inject_key = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.inject_partition = -2 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.inject_password = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.iscsi_iface = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] 
libvirt.iser_use_multipath = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.live_migration_bandwidth = 0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.live_migration_downtime = 500 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.live_migration_inbound_addr = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.live_migration_permit_post_copy = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.live_migration_scheme = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.live_migration_timeout_action = abort {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.live_migration_tunnelled = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: WARNING oslo_config.cfg [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Apr 24 00:07:07 user nova-compute[71205]: live_migration_uri is deprecated for removal in favor of two other options that Apr 24 00:07:07 user nova-compute[71205]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Apr 24 00:07:07 user nova-compute[71205]: and 
``live_migration_inbound_addr`` respectively. Apr 24 00:07:07 user nova-compute[71205]: ). Its value may be silently ignored in the future. Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.live_migration_uri = qemu+ssh://stack@%s/system {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.live_migration_with_native_tls = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.max_queues = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.nfs_mount_options = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.nfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.num_iser_scan_tries = 5 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.num_memory_encrypted_guests = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.num_pcie_ports = 0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.num_volume_scan_tries = 5 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.pmem_namespaces = [] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user 
nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.quobyte_client_cfg = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/nova/mnt {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.rbd_connect_timeout = 5 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.rbd_secret_uuid = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.rbd_user = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.remote_filesystem_transport = ssh {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.rescue_image_id = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.rescue_kernel_id = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.rescue_ramdisk_id = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.rx_queue_size = None {{(pid=71205) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.smbfs_mount_options = {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.snapshot_compression = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.snapshot_image_format = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.sparse_logical_volumes = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.swtpm_enabled = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.swtpm_group = tss {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.swtpm_user = tss {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.sysinfo_serial = unique {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.tx_queue_size = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.uid_maps = [] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.use_virtio_for_bridges = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] 
libvirt.virt_type = kvm {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.volume_clear = zero {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.volume_clear_size = 0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.volume_use_multipath = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.vzstorage_cache_path = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.vzstorage_mount_group = qemu {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.vzstorage_mount_opts = [] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/nova/mnt {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.vzstorage_mount_user = stack {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] neutron.auth_section = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] neutron.auth_type = password {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user 
nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] neutron.cafile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] neutron.certfile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] neutron.collect_timing = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] neutron.connect_retries = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] neutron.connect_retry_delay = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] neutron.default_floating_pool = public {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] neutron.endpoint_override = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] neutron.extension_sync_interval = 600 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] neutron.http_retries = 3 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] neutron.insecure = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] neutron.keyfile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] neutron.max_version = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] neutron.min_version = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: 
DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] neutron.ovs_bridge = br-int {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] neutron.physnets = [] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] neutron.region_name = RegionOne {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] neutron.service_metadata_proxy = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] neutron.service_name = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] neutron.service_type = network {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] neutron.split_loggers = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] neutron.status_code_retries = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] neutron.status_code_retry_delay = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] neutron.timeout = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] neutron.version = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] notifications.bdms_in_notifications = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] notifications.default_level = INFO {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user 
nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] notifications.notification_format = unversioned {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] notifications.notify_on_state_change = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] pci.alias = [] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] pci.device_spec = [] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] pci.report_in_placement = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.auth_section = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.auth_type = password {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.auth_url = http://10.0.0.210/identity {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.cafile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.certfile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.collect_timing = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.connect_retries = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.connect_retry_delay = None {{(pid=71205) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.default_domain_id = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.default_domain_name = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.domain_id = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.domain_name = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.endpoint_override = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.insecure = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.keyfile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.max_version = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.min_version = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.password = **** {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.project_domain_id = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.project_domain_name = Default {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.project_id = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.project_name = service {{(pid=71205) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.region_name = RegionOne {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.service_name = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.service_type = placement {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.split_loggers = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.status_code_retries = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.status_code_retry_delay = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.system_scope = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.timeout = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.trust_id = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.user_domain_id = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.user_domain_name = Default {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.user_id = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.username = placement {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.valid_interfaces = ['internal', 'public'] 
{{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] placement.version = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] quota.cores = 20 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] quota.count_usage_from_placement = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] quota.injected_file_content_bytes = 10240 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] quota.injected_file_path_length = 255 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] quota.injected_files = 5 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] quota.instances = 10 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] quota.key_pairs = 100 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] quota.metadata_items = 128 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] quota.ram = 51200 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] quota.recheck_quota = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] quota.server_group_members = 10 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] quota.server_groups = 10 {{(pid=71205) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] rdp.enabled = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] scheduler.image_metadata_prefilter = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] scheduler.max_attempts = 3 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] scheduler.max_placement_results = 1000 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] scheduler.query_placement_for_availability_zone = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] scheduler.query_placement_for_image_type_support = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] scheduler.workers = 3 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 
00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] filter_scheduler.enabled_filters = ['AvailabilityZoneFilter', 'ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] filter_scheduler.host_subset_size = 1 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] filter_scheduler.isolated_hosts = [] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: 
DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] filter_scheduler.isolated_images = [] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] filter_scheduler.pci_in_placement = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] filter_scheduler.track_instance_changes = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] metrics.required = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service 
[None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] metrics.weight_multiplier = 1.0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] metrics.weight_setting = [] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] serial_console.enabled = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] serial_console.port_range = 10000:20000 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] serial_console.serialproxy_port = 6083 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] service_user.auth_section = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] service_user.auth_type = password {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] service_user.cafile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] service_user.certfile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] service_user.collect_timing = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 
00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] service_user.insecure = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] service_user.keyfile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] service_user.send_service_user_token = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] service_user.split_loggers = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] service_user.timeout = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] spice.agent_enabled = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] spice.enabled = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] spice.html5proxy_base_url = http://10.0.0.210:6081/spice_auto.html {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] spice.html5proxy_port = 6082 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] spice.image_compression = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] spice.jpeg_compression = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] spice.playback_compression = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] spice.server_listen = 127.0.0.1 {{(pid=71205) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] spice.streaming_mode = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] spice.zlib_compression = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] upgrade_levels.baseapi = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] upgrade_levels.cert = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] upgrade_levels.compute = auto {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] upgrade_levels.conductor = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] upgrade_levels.scheduler = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vendordata_dynamic_auth.auth_section = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vendordata_dynamic_auth.auth_type = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vendordata_dynamic_auth.cafile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vendordata_dynamic_auth.certfile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] 
vendordata_dynamic_auth.insecure = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vendordata_dynamic_auth.keyfile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vendordata_dynamic_auth.timeout = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vmware.api_retry_count = 10 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vmware.ca_file = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vmware.cache_prefix = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vmware.cluster_name = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vmware.connection_pool_size = 10 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vmware.console_delay_seconds = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vmware.datastore_regex = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vmware.host_ip = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vmware.host_password = **** {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vmware.host_port = 443 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] 
vmware.host_username = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vmware.insecure = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vmware.integration_bridge = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vmware.maximum_objects = 100 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vmware.pbm_default_policy = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vmware.pbm_enabled = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vmware.pbm_wsdl_location = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vmware.serial_port_proxy_uri = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vmware.serial_port_service_uri = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vmware.task_poll_interval = 0.5 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vmware.use_linked_clone = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vmware.vnc_keymap = en-us {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vmware.vnc_port = 5900 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] 
vmware.vnc_port_total = 10000 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vnc.auth_schemes = ['none'] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vnc.enabled = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vnc.novncproxy_base_url = http://10.0.0.210:6080/vnc_lite.html {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vnc.novncproxy_port = 6080 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vnc.server_listen = 0.0.0.0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vnc.server_proxyclient_address = 10.0.0.210 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vnc.vencrypt_ca_certs = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vnc.vencrypt_client_cert = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vnc.vencrypt_client_key = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] workarounds.disable_group_policy_check_upcall = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG 
oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] workarounds.disable_rootwrap = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] workarounds.enable_numa_live_migration = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] workarounds.libvirt_disable_apic = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None 
req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] wsgi.client_socket_timeout = 900 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] wsgi.default_pool_size = 1000 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] wsgi.keep_alive = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] wsgi.max_header_line = 16384 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] wsgi.secure_proxy_ssl_header = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] wsgi.ssl_ca_file = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] wsgi.ssl_cert_file = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] wsgi.ssl_key_file = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] wsgi.tcp_keepidle = 600 {{(pid=71205) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] zvm.ca_file = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] zvm.cloud_connector_url = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] zvm.image_tmp_path = /opt/stack/data/nova/images {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] zvm.reachable_timeout = 300 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_policy.enforce_new_defaults = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_policy.enforce_scope = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_policy.policy_default_rule = default {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_policy.policy_file = policy.yaml {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} 
Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] profiler.connection_string = messaging:// {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] profiler.enabled = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] profiler.es_doc_type = notification {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] profiler.es_scroll_size = 10000 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] profiler.es_scroll_time = 2m {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] profiler.filter_error_trace = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] profiler.hmac_keys = SECRET_KEY {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] profiler.sentinel_service_name = mymaster {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] profiler.socket_timeout = 0.1 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] 
profiler.trace_sqlalchemy = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] remote_debug.host = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] remote_debug.port = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=71205) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.rabbit_quroum_max_memory_bytes = 0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.rabbit_quroum_max_memory_length = 0 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 
{{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.ssl = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_rabbit.ssl_version = {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_notifications.retry = -1 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_messaging_notifications.transport_url = **** {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_limit.auth_section = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_limit.auth_type = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user 
nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_limit.cafile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_limit.certfile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_limit.collect_timing = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_limit.connect_retries = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_limit.connect_retry_delay = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_limit.endpoint_id = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_limit.endpoint_override = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_limit.insecure = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_limit.keyfile = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_limit.max_version = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_limit.min_version = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_limit.region_name = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_limit.service_name = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_limit.service_type = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user 
nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_limit.split_loggers = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_limit.status_code_retries = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_limit.status_code_retry_delay = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_limit.timeout = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_limit.valid_interfaces = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_limit.version = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_reports.file_event_handler = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] oslo_reports.log_dir = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vif_plug_linux_bridge_privileged.group = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] 
vif_plug_linux_bridge_privileged.thread_pool_size = 12 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vif_plug_linux_bridge_privileged.user = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vif_plug_ovs_privileged.group = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vif_plug_ovs_privileged.thread_pool_size = 12 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] vif_plug_ovs_privileged.user = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] os_vif_linux_bridge.flat_interface = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=71205) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] os_vif_linux_bridge.vlan_interface = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] os_vif_ovs.isolate_vif = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] os_vif_ovs.ovsdb_interface = native {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] os_vif_ovs.per_port_bridge = False {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] os_brick.lock_path = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] privsep_osbrick.capabilities = [21] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] privsep_osbrick.group = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] privsep_osbrick.helper_command = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None 
req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] privsep_osbrick.thread_pool_size = 12 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] privsep_osbrick.user = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] nova_sys_admin.group = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] nova_sys_admin.helper_command = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] nova_sys_admin.thread_pool_size = 12 {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] nova_sys_admin.user = None {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG oslo_service.service [None req-bcfc27bd-da99-42ed-8e04-bd4c1b0f25d6 None None] ******************************************************************************** {{(pid=71205) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} Apr 24 00:07:07 user nova-compute[71205]: INFO nova.service [-] Starting compute node (version 0.0.0) Apr 24 00:07:07 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Starting native event thread {{(pid=71205) _init_events /opt/stack/nova/nova/virt/libvirt/host.py:492}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Starting green dispatch thread {{(pid=71205) _init_events /opt/stack/nova/nova/virt/libvirt/host.py:498}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Starting connection event dispatch thread {{(pid=71205) initialize /opt/stack/nova/nova/virt/libvirt/host.py:620}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Connecting to libvirt: qemu:///system {{(pid=71205) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:503}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Registering for lifecycle events {{(pid=71205) 
_get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:509}} Apr 24 00:07:07 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Registering for connection events: {{(pid=71205) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:530}} Apr 24 00:07:07 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Connection event '1' reason 'None' Apr 24 00:07:07 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Cannot update service status on host "user" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host user could not be found. Apr 24 00:07:07 user nova-compute[71205]: DEBUG nova.virt.libvirt.volume.mount [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Initialising _HostMountState generation 0 {{(pid=71205) host_up /opt/stack/nova/nova/virt/libvirt/volume/mount.py:130}} Apr 24 00:07:15 user nova-compute[71205]: INFO nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Libvirt host capabilities
[The original log continues here with a multi-line libvirt <capabilities> XML dump whose markup was stripped during capture, leaving only bare values between repeated syslog prefixes. Recoverable content: host UUID e20c3142-5af9-7467-ecd8-70b2e4a210d6; host arch x86_64; CPU model IvyBridge-IBRS, vendor Intel; live-migration transports tcp and rdma; memory/page counters 8152920/2038230 and 8255068/2063767; security models apparmor and dac (DOI 0, baselabel +64055:+108); and one hvm <guest> entry per installed QEMU emulator (alpha, arm, aarch64, cris, i386, m68k, microblaze, microblazeel, mips, mipsel, mips64, mips64el, ppc, ppc64, ppc64le, riscv32, riscv64, s390x, sh4, sh4eb, sparc, sparc64, x86_64, xtensa, xtensaeb), each listing its word size, emulator path, and supported machine types, for example virt-* and board models for arm/aarch64, pc-i440fx-*/pc-q35-*/ubuntu*/microvm for i386 and x86_64, pseries-*/powernv* for ppc64 and ppc64le, and s390-ccw-virtio-* for s390x.]
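(Editorial sketch, not part of the captured log: the capabilities document summarized above can be regenerated with its markup intact. This assumes the python3-libvirt bindings and read access to the qemu:///system URI that nova-compute connects to earlier in this log; running "virsh capabilities" on the host should print the same document.)

    import libvirt

    # Read-only connection to the same URI nova-compute uses above.
    conn = libvirt.openReadOnly('qemu:///system')
    try:
        # Returns the <capabilities> XML string that nova-compute dumps
        # after the "Libvirt host capabilities" INFO line.
        print(conn.getCapabilities())
    finally:
        conn.close()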
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Getting domain capabilities for alpha via machine types: {None} {{(pid=71205) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch alpha / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-alpha' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Getting domain capabilities for armv6l via machine types: {'virt', None} {{(pid=71205) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch armv6l / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch armv6l / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Getting domain capabilities for armv7l via machine types: {'virt'} {{(pid=71205) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch armv7l / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Getting domain capabilities for aarch64 via machine types: {'virt'} {{(pid=71205) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch aarch64 / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-aarch64' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Getting domain capabilities for cris via machine types: {None} {{(pid=71205) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch cris / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-cris' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Getting domain capabilities for i686 via machine types: {'ubuntu-q35', 'ubuntu', 'q35', 'pc'} {{(pid=71205) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=ubuntu-q35:
    [domainCapabilities XML, tags stripped in capture: emulator /usr/bin/qemu-system-i386, domain kvm, machine pc-q35-jammy, arch i686; loaders /usr/share/OVMF/OVMF_CODE.fd, OVMF_CODE.secboot.fd, /usr/share/AAVMF/AAVMF_CODE.fd, AAVMF32_CODE.fd, OVMF_CODE.ms.fd (rom, pflash); host-model CPU IvyBridge-IBRS (Intel); custom CPU models qemu64, qemu32, phenom, pentium3 ... Broadwell, 486; memory backing file/anonymous/memfd; disk devices disk/cdrom/floppy/lun on buses fdc/scsi/virtio/usb/sata; graphics sdl/vnc/spice/egl-headless; hostdev subsystem usb/pci/scsi; rng random/egd/builtin; filesystem drivers path/handle/virtiofs; TPM models tpm-tis/tpm-crb, backends passthrough/emulator]
Apr 24 00:07:15 user nova-compute[71205]: {{(pid=71205) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=ubuntu:
    [domainCapabilities XML, tags stripped in capture: emulator /usr/bin/qemu-system-i386, domain kvm, machine pc-i440fx-jammy, arch i686; same loader, CPU, graphics, hostdev, rng, filesystem and TPM values as the ubuntu-q35 dump above, with ide added to the disk bus list (ide/fdc/scsi/virtio/usb/sata)]
Apr 24 00:07:15 user nova-compute[71205]: {{(pid=71205) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
    [domainCapabilities XML, tags stripped in capture: emulator /usr/bin/qemu-system-i386, domain kvm, machine pc-q35-6.2, arch i686; otherwise the same values as the ubuntu-q35 dump above (disk buses fdc/scsi/virtio/usb/sata)]
Apr 24 00:07:15 user nova-compute[71205]: {{(pid=71205) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
    [domainCapabilities XML, tags stripped in capture: emulator /usr/bin/qemu-system-i386, domain kvm, machine pc-i440fx-6.2, arch i686; otherwise the same values as the ubuntu dump above (disk buses ide/fdc/scsi/virtio/usb/sata)]
Apr 24 00:07:15 user nova-compute[71205]: {{(pid=71205) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
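Each dump condensed above is a libvirt domainCapabilities document for one emulator/arch/machine-type combination. A minimal sketch of pulling the same fields out of such a document with the libvirt Python bindings, using arguments that mirror the arch=i686 / machine_type=q35 probe logged above (the snippet is illustrative, not the nova implementation):

import libvirt
import xml.etree.ElementTree as ET

conn = libvirt.openReadOnly('qemu:///system')  # URI assumed

# Mirrors the probe logged above for arch=i686 and machine_type=q35.
xml_doc = conn.getDomainCapabilities(
    '/usr/bin/qemu-system-i386',  # emulatorbin
    'i686',                       # arch
    'q35',                        # machine
    'kvm')                        # virttype
caps = ET.fromstring(xml_doc)

print('canonical machine:', caps.findtext('./machine'))  # e.g. pc-q35-6.2 above
print('host-model CPU   :', caps.findtext("./cpu/mode[@name='host-model']/model"))
print('disk buses       :', [v.text for v in caps.findall("./devices/disk/enum[@name='bus']/value")])

conn.close()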
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Getting domain capabilities for m68k via machine types: {'virt', None} {{(pid=71205) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch m68k / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-m68k' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch m68k / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-m68k' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Getting domain capabilities for microblaze via machine types: {None} {{(pid=71205) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch microblaze / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-microblaze' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Getting domain capabilities for microblazeel via machine types: {None} {{(pid=71205) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch microblazeel / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-microblazeel' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Getting domain capabilities for mips via machine types: {None} {{(pid=71205) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch mips / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Getting domain capabilities for mipsel via machine types: {None} {{(pid=71205) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch mipsel / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mipsel' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Getting domain capabilities for mips64 via machine types: {None} {{(pid=71205) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch mips64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips64' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Getting domain capabilities for mips64el via machine types: {None} {{(pid=71205) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch mips64el / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips64el' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Getting domain capabilities for ppc via machine types: {None} {{(pid=71205) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch ppc / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Getting domain capabilities for ppc64 via machine types: {'powernv', None, 'pseries'} {{(pid=71205) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type powernv: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type pseries: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Getting domain capabilities for ppc64le via machine types: {'pseries', 'powernv'} {{(pid=71205) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch ppc64le / virt_type kvm / machine_type pseries: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64le' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch ppc64le / virt_type kvm / machine_type powernv: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64le' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Getting domain capabilities for riscv32 via machine types: {None} {{(pid=71205) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch riscv32 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-riscv32' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Getting domain capabilities for riscv64 via machine types: {None} {{(pid=71205) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch riscv64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-riscv64' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Getting domain capabilities for s390x via machine types: {'s390-ccw-virtio'} {{(pid=71205) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch s390x / virt_type kvm / machine_type s390-ccw-virtio: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-s390x' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Getting domain capabilities for sh4 via machine types: {None} {{(pid=71205) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch sh4 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sh4' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Getting domain capabilities for sh4eb via machine types: {None} {{(pid=71205) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch sh4eb / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sh4eb' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Getting domain capabilities for sparc via machine types: {None} {{(pid=71205) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch sparc / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sparc' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Getting domain capabilities for sparc64 via machine types: {None} {{(pid=71205) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch sparc64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sparc64' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
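The long run of "[Error Code 8]: invalid argument: KVM is not supported ..." lines above is libvirt rejecting the domain-capabilities query for every emulator that cannot use KVM on this host; error code 8 corresponds to VIR_ERR_INVALID_ARG. A hedged sketch of the same probe-and-skip pattern (the emulator/arch pairs come from the log; the loop itself is only illustrative, not nova's code):

import libvirt

conn = libvirt.openReadOnly('qemu:///system')  # URI assumed

# A few of the emulator/arch pairs probed with virt_type kvm in the log above.
probes = [
    ('/usr/bin/qemu-system-mips', 'mips'),
    ('/usr/bin/qemu-system-ppc64', 'ppc64'),
    ('/usr/bin/qemu-system-sparc64', 'sparc64'),
]
for emulator, arch in probes:
    try:
        conn.getDomainCapabilities(emulator, arch, None, 'kvm')
        print(arch, 'can use kvm on this host')
    except libvirt.libvirtError as exc:
        # VIR_ERR_INVALID_ARG (8) is what produces the
        # "KVM is not supported by '...' on this host" messages above.
        if exc.get_error_code() == libvirt.VIR_ERR_INVALID_ARG:
            print(arch, 'skipped:', exc.get_error_message())
        else:
            raise

conn.close()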
_add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Getting domain capabilities for x86_64 via machine types: {'ubuntu-q35', 'ubuntu', 'q35', 'pc'} {{(pid=71205) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=ubuntu-q35: [domainCapabilities XML elided; the markup was stripped in this capture. Recoverable content: emulator /usr/bin/qemu-system-x86_64, domain type kvm, machine pc-q35-jammy, arch x86_64; EFI firmware loaders /usr/share/OVMF/OVMF_CODE_4M.ms.fd, /usr/share/OVMF/OVMF_CODE_4M.secboot.fd and /usr/share/OVMF/OVMF_CODE_4M.fd (loader types rom/pflash, secure boot yes/no); host-model CPU IvyBridge-IBRS (Intel) plus the custom CPU model list from 486/qemu32/qemu64 through Broadwell, Skylake, Icelake, Cascadelake, Cooperlake, Snowridge, EPYC, EPYC-Rome and EPYC-Milan; memory backing file/anonymous/memfd; disk devices disk/cdrom/floppy/lun on buses fdc/scsi/virtio/usb/sata; virtio, virtio-transitional and virtio-non-transitional device models; graphics sdl/vnc/spice/egl-headless; hostdev subsystem types usb/pci/scsi with startup policies default/mandatory/requisite/optional; rng backends random/egd/builtin; filesystem drivers path/handle/virtiofs; TPM models tpm-tis/tpm-crb with backends passthrough/emulator] {{(pid=71205) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=ubuntu: [domainCapabilities XML elided; same capability set as for ubuntu-q35, except machine pc-i440fx-jammy, a single EFI loader /usr/share/OVMF/OVMF_CODE_4M.fd with secure boot not offered, and an additional ide disk bus] {{(pid=71205) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35: [domainCapabilities XML elided; same capability set as for ubuntu-q35, with machine pc-q35-6.2] {{(pid=71205) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc: [domainCapabilities XML elided; same capability set as for ubuntu, with machine pc-i440fx-6.2] {{(pid=71205) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Getting domain capabilities for xtensa via machine types: {None} {{(pid=71205) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch xtensa / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-xtensa' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Getting domain capabilities for xtensaeb via machine types: {None} {{(pid=71205) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Error from libvirt when retrieving domain capabilities for arch xtensaeb / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-xtensaeb' on this host {{(pid=71205) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Checking secure boot support for host arch (x86_64) {{(pid=71205) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1750}}
Apr 24 00:07:15 user nova-compute[71205]: INFO nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Secure Boot support detected
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] cpu compare xml: [CPU XML elided; the markup was stripped in this capture, only the model name Nehalem survives] {{(pid=71205) _compare_cpu /opt/stack/nova/nova/virt/libvirt/driver.py:9996}}
Apr 24 00:07:15 user nova-compute[71205]: INFO nova.virt.node [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Generated node identity 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4
Apr 24 00:07:15 user nova-compute[71205]: INFO nova.virt.node [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Wrote node identity 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 to /opt/stack/data/nova/compute_id
Apr 24 00:07:15 user nova-compute[71205]: WARNING nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Compute nodes ['67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4'] for host user were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
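Annotation: the capability probing above boils down to two libvirt calls, virConnectGetDomainCapabilities for each arch/machine pair and, for the custom 'Nehalem' guest model, virConnectCompareCPU. A minimal sketch of the same probing with the libvirt-python bindings, outside of Nova, is below; the helper name is made up, while the qemu paths and the model come from the log entries above.

```python
# Illustrative sketch only: reproduces the probing seen in the DEBUG entries
# above using libvirt-python directly; probe_domain_caps() is not Nova code.
import libvirt


def probe_domain_caps(conn, arch, machine):
    """Return the domainCapabilities XML for one arch/machine pair, or None."""
    try:
        return conn.getDomainCapabilities(
            emulatorbin='/usr/bin/qemu-system-%s' % arch,
            arch=arch, machine=machine, virttype='kvm')
    except libvirt.libvirtError as err:
        # Non-native architectures fail with VIR_ERR_INVALID_ARG, the
        # "[Error Code 8]: invalid argument" seen above; nova-compute logs
        # the error at DEBUG and moves on to the next arch.
        if err.get_error_code() == libvirt.VIR_ERR_INVALID_ARG:
            return None
        raise


conn = libvirt.openReadOnly('qemu:///system')
print(probe_domain_caps(conn, 'x86_64', 'q35') is not None)     # True on this host
print(probe_domain_caps(conn, 'ppc64', 'pseries') is not None)  # False: no KVM

# The "_compare_cpu" entry is virConnectCompareCPU against a guest CPU
# definition; with the custom 'Nehalem' model the host CPU must be a superset.
guest_cpu = "<cpu><arch>x86_64</arch><model>Nehalem</model></cpu>"
print(conn.compareCPU(guest_cpu, 0) >= libvirt.VIR_CPU_COMPARE_IDENTICAL)
```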
Apr 24 00:07:15 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host Apr 24 00:07:15 user nova-compute[71205]: WARNING nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] No compute node record found for host user. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host user could not be found. Apr 24 00:07:15 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:07:15 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:07:15 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:07:15 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:07:15 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
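Annotation: the paired 'Acquiring lock "compute_resources"' / 'Lock "compute_resources" acquired by ...' / '"released" by ...' DEBUG entries come from oslo.concurrency's lockutils wrapping the resource tracker's critical sections. A bare-bones sketch of that pattern is below; the decorated function is an illustrative stand-in, not the real ResourceTracker method.

```python
# Sketch of the oslo.concurrency pattern behind the "compute_resources"
# acquire/release DEBUG lines above; the function body is a stand-in.
from oslo_concurrency import lockutils


@lockutils.synchronized('compute_resources')
def clean_compute_node_cache_stub():
    # Runs while holding the in-process "compute_resources" semaphore;
    # lockutils itself emits the 'Lock ... acquired by ...' and
    # '"released" by ...' DEBUG messages around this call.
    return 'cache cleaned'


print(clean_compute_node_cache_stub())
```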
Apr 24 00:07:15 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Hypervisor/Node resource view: name=user free_ram=10854MB free_disk=27.12004852294922GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:07:15 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:07:15 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:07:15 user nova-compute[71205]: WARNING nova.compute.resource_tracker [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] No compute node record for user:67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 could not be found. Apr 24 00:07:15 user nova-compute[71205]: INFO nova.compute.resource_tracker [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Compute node record created for user:user with uuid: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 Apr 24 00:07:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:07:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:07:16 user nova-compute[71205]: INFO nova.scheduler.client.report [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [req-67d6d2fd-c72d-4bf1-8762-b82638b93610] Created resource provider record via placement API for resource provider with UUID 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 and name user. 
Apr 24 00:07:16 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] /sys/module/kvm_amd/parameters/sev does not exist {{(pid=71205) _kernel_supports_amd_sev /opt/stack/nova/nova/virt/libvirt/host.py:1766}} Apr 24 00:07:16 user nova-compute[71205]: INFO nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] kernel doesn't support AMD SEV Apr 24 00:07:16 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Updating inventory in ProviderTree for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 with inventory: {'MEMORY_MB': {'total': 16023, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 12, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 40, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 0}} {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 24 00:07:16 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71205) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 24 00:07:16 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Libvirt baseline CPU Apr 24 00:07:16 user nova-compute[71205]: x86_64 Apr 24 00:07:16 user nova-compute[71205]: Nehalem Apr 24 00:07:16 user nova-compute[71205]: Intel Apr 24 00:07:16 user nova-compute[71205]: Apr 24 00:07:16 user nova-compute[71205]: Apr 24 00:07:16 user nova-compute[71205]: {{(pid=71205) _get_guest_baseline_cpu_features /opt/stack/nova/nova/virt/libvirt/driver.py:12486}} Apr 24 00:07:16 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Updated inventory for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 16023, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 12, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 40, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} Apr 24 00:07:16 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Updating resource provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 generation from 0 to 1 during operation: update_inventory {{(pid=71205) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} Apr 24 00:07:16 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Updating inventory in ProviderTree for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 with inventory: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} 
{{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 24 00:07:16 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Updating resource provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 generation from 1 to 2 during operation: update_traits {{(pid=71205) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} Apr 24 00:07:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:07:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.638s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:07:16 user nova-compute[71205]: DEBUG nova.service [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Creating RPC server for service compute {{(pid=71205) start /opt/stack/nova/nova/service.py:182}} Apr 24 00:07:16 user nova-compute[71205]: DEBUG nova.service [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Join ServiceGroup membership for this service compute {{(pid=71205) start /opt/stack/nova/nova/service.py:199}} Apr 24 00:07:16 user nova-compute[71205]: DEBUG nova.servicegroup.drivers.db [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] DB_Driver: join new ServiceGroup member user to the compute group, service = {{(pid=71205) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} Apr 24 00:07:48 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._sync_power_states {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:07:48 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:08:07 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:08:07 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:08:07 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Starting heal instance info cache {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 24 00:08:07 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Rebuilding the list of instances to heal {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 24 00:08:07 user nova-compute[71205]: 
DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Didn't find any instances for network info cache update. {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 24 00:08:07 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:08:07 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:08:07 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:08:07 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:08:07 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:08:07 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:08:07 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71205) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 24 00:08:07 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:08:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:08:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:08:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:08:07 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:08:07 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:08:07 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 24 00:08:07 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Hypervisor/Node resource view: name=user free_ram=10211MB free_disk=27.033802032470703GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:08:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:08:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:08:07 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:08:07 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:08:07 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:08:07 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:08:07 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:08:07 user nova-compute[71205]: DEBUG 
oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.134s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:09:07 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:09:07 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:09:07 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:09:08 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:09:08 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Starting heal instance info cache {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 24 00:09:08 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Rebuilding the list of instances to heal {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 24 00:09:08 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Didn't find any instances for network info cache update. 
{{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 24 00:09:08 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:09:08 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:09:08 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:09:09 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:09:09 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:09:09 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71205) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 24 00:09:09 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:09:09 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:09:09 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:09:09 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:09:09 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:09:09 user 
nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:09:09 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:09:09 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Hypervisor/Node resource view: name=user free_ram=10218MB free_disk=27.076557159423828GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", 
"product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:09:09 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:09:09 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:09:09 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:09:09 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:09:09 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:09:09 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 
'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:09:09 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:09:09 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.134s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:10:07 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:10:08 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:10:09 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:10:09 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:10:09 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:10:09 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:10:09 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:10:09 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:10:09 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71205) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:10:09 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:10:09 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:10:09 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:10:09 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Hypervisor/Node resource view: name=user free_ram=10187MB free_disk=26.85647201538086GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:10:09 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:10:09 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:10:09 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:10:09 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:10:09 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:10:09 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed 
for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:10:09 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:10:09 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.168s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:10:10 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:10:10 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Starting heal instance info cache {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 24 00:10:10 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Rebuilding the list of instances to heal {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 24 00:10:10 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Didn't find any instances for network info cache update. {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 24 00:10:10 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:10:10 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:10:10 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71205) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 24 00:11:04 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Acquiring lock "dce8722e-982a-458a-9efb-59d08a5717c7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:04 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Lock "dce8722e-982a-458a-9efb-59d08a5717c7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.003s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:04 user nova-compute[71205]: DEBUG nova.compute.manager [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Starting instance... {{(pid=71205) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 24 00:11:04 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:04 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:04 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71205) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 24 00:11:04 user nova-compute[71205]: INFO nova.compute.claims [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Claim successful on node user Apr 24 00:11:04 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:11:04 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:11:04 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.264s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:04 user nova-compute[71205]: DEBUG nova.compute.manager [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Start building networks asynchronously for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 24 00:11:04 user nova-compute[71205]: DEBUG nova.compute.manager [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Allocating IP information in the background. {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 24 00:11:04 user nova-compute[71205]: DEBUG nova.network.neutron [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] allocate_for_instance() {{(pid=71205) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 24 00:11:04 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 24 00:11:04 user nova-compute[71205]: DEBUG nova.compute.manager [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Start building block device mappings for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 24 00:11:04 user nova-compute[71205]: DEBUG nova.compute.manager [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Start spawning the instance on the hypervisor. {{(pid=71205) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 24 00:11:04 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Creating instance directory {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 24 00:11:04 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Creating image(s) Apr 24 00:11:04 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Acquiring lock "/opt/stack/data/nova/instances/dce8722e-982a-458a-9efb-59d08a5717c7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:04 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Lock "/opt/stack/data/nova/instances/dce8722e-982a-458a-9efb-59d08a5717c7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:04 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Lock "/opt/stack/data/nova/instances/dce8722e-982a-458a-9efb-59d08a5717c7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:04 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Acquiring lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:04 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d 
tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:05 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8.part --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:05 user nova-compute[71205]: DEBUG nova.policy [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '31621d02dc5143689c0d8c1280479fdc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4ebbe37ffda44c76a78244b3928f809b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71205) authorize /opt/stack/nova/nova/policy.py:203}} Apr 24 00:11:05 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8.part --force-share --output=json" returned: 0 in 0.126s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:05 user nova-compute[71205]: DEBUG nova.virt.images [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] fcf09ead-c5af-40cc-b5cf-92626e181ef9 was qcow2, converting to raw {{(pid=71205) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 24 00:11:05 user nova-compute[71205]: DEBUG nova.privsep.utils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71205) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 24 00:11:05 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8.part /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8.converted {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:05 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None 
req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8.part /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8.converted" returned: 0 in 0.172s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:05 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8.converted --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:05 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8.converted --force-share --output=json" returned: 0 in 0.110s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:05 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.710s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:05 user nova-compute[71205]: INFO oslo.privsep.daemon [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova-cpu.conf', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmptvz317ru/privsep.sock'] Apr 24 00:11:05 user sudo[80267]: stack : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova-cpu.conf --privsep_context nova.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmptvz317ru/privsep.sock Apr 24 00:11:05 user sudo[80267]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1001) Apr 24 00:11:07 user sudo[80267]: pam_unix(sudo:session): session closed for user root Apr 24 00:11:07 user nova-compute[71205]: INFO oslo.privsep.daemon [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Spawned new privsep daemon via rootwrap Apr 24 00:11:07 user nova-compute[71205]: INFO oslo.privsep.daemon [-] privsep daemon starting Apr 24 00:11:07 user nova-compute[71205]: INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 Apr 24 00:11:07 user nova-compute[71205]: INFO oslo.privsep.daemon [-] privsep process running with capabilities 
(eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none Apr 24 00:11:07 user nova-compute[71205]: INFO oslo.privsep.daemon [-] privsep daemon running as pid 80271 Apr 24 00:11:07 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:07 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:11:07 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.118s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Acquiring lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:07 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:07 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.121s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:07 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/dce8722e-982a-458a-9efb-59d08a5717c7/disk 1073741824 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:07 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/dce8722e-982a-458a-9efb-59d08a5717c7/disk 1073741824" returned: 0 in 0.046s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.172s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:07 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:07 user nova-compute[71205]: DEBUG nova.network.neutron [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Successfully created port: caba6f96-07db-411c-a38b-86be3bb1c71a {{(pid=71205) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 24 00:11:07 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.115s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:07 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 
tempest-VolumesActionsTest-1689103952-project-member] Checking if we can resize image /opt/stack/data/nova/instances/dce8722e-982a-458a-9efb-59d08a5717c7/disk. size=1073741824 {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 24 00:11:07 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dce8722e-982a-458a-9efb-59d08a5717c7/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:07 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dce8722e-982a-458a-9efb-59d08a5717c7/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:07 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Cannot resize image /opt/stack/data/nova/instances/dce8722e-982a-458a-9efb-59d08a5717c7/disk to a smaller size. {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 24 00:11:07 user nova-compute[71205]: DEBUG nova.objects.instance [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Lazy-loading 'migration_context' on Instance uuid dce8722e-982a-458a-9efb-59d08a5717c7 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:11:07 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Created local disks {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 24 00:11:07 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Ensure instance console log exists: /opt/stack/data/nova/instances/dce8722e-982a-458a-9efb-59d08a5717c7/console.log {{(pid=71205) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 24 00:11:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 
tempest-VolumesActionsTest-1689103952-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:08 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:11:09 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:11:09 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:11:09 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:11:09 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:11:10 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:11:10 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Starting heal instance info cache {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 24 00:11:10 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Rebuilding the list of instances to heal {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 24 00:11:10 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Skipping network cache update for instance because it is Building. {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9805}} Apr 24 00:11:10 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Didn't find any instances for network info cache update. 
{{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 24 00:11:10 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:11:10 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71205) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 24 00:11:10 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:11:10 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:10 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:10 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:10 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:11:10 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:11:10 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
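
Note on the qemu-img activity recorded above between 00:11:05 and 00:11:07: every image inspection and conversion runs through oslo.concurrency's processutils with a prlimit wrapper (--as=1073741824 --cpu=30), which caps the helper's address space and CPU time. The snippet below is a minimal, illustrative reconstruction of that call pattern, not nova's exact code; the helper name qemu_img_info and the constant QEMU_IMG_LIMITS are assumptions chosen to mirror the CMD lines in the log.

    import json
    from oslo_concurrency import processutils

    # Resource caps matching the logged wrapper:
    #   /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- ...
    QEMU_IMG_LIMITS = processutils.ProcessLimits(
        cpu_time=30,              # --cpu=30
        address_space=1024 ** 3)  # --as=1073741824 (1 GiB)

    def qemu_img_info(path):
        # 'env LC_ALL=C LANG=C qemu-img info <path> --force-share --output=json'
        # is taken verbatim from the CMD lines in the log; prlimit= makes
        # processutils wrap the command with oslo_concurrency.prlimit as shown above.
        out, _err = processutils.execute(
            'env', 'LC_ALL=C', 'LANG=C',
            'qemu-img', 'info', path, '--force-share', '--output=json',
            prlimit=QEMU_IMG_LIMITS)
        return json.loads(out)

    # Example: inspect the cached base image referenced in the log.
    # info = qemu_img_info(
    #     '/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8')
    # print(info.get('format'), info.get('virtual-size'))

The qcow2-to-raw conversion logged at 00:11:05 ("qemu-img convert -t none -O raw -f qcow2 ...") and the later overlay creation ("qemu-img create -f qcow2 -o backing_file=...") go through the same execute() path, only without the JSON parsing step.
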
Apr 24 00:11:10 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Hypervisor/Node resource view: name=user free_ram=9370MB free_disk=26.847278594970703GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:11:10 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:10 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:10 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance dce8722e-982a-458a-9efb-59d08a5717c7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:11:10 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:11:10 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:11:10 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:11:10 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:11:10 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:11:10 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.222s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:11 user nova-compute[71205]: DEBUG nova.network.neutron [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Successfully updated port: caba6f96-07db-411c-a38b-86be3bb1c71a {{(pid=71205) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 24 00:11:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Acquiring lock "refresh_cache-dce8722e-982a-458a-9efb-59d08a5717c7" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:11:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Acquired lock "refresh_cache-dce8722e-982a-458a-9efb-59d08a5717c7" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:11:11 user 
nova-compute[71205]: DEBUG nova.network.neutron [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Building network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 24 00:11:11 user nova-compute[71205]: DEBUG nova.network.neutron [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Instance cache missing network info. {{(pid=71205) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 24 00:11:11 user nova-compute[71205]: DEBUG nova.compute.manager [req-bf9a1031-7c45-472c-af2c-24b0421e6853 req-c9632da0-3789-4e13-858a-5ea4889b0adc service nova] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Received event network-changed-caba6f96-07db-411c-a38b-86be3bb1c71a {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:11:11 user nova-compute[71205]: DEBUG nova.compute.manager [req-bf9a1031-7c45-472c-af2c-24b0421e6853 req-c9632da0-3789-4e13-858a-5ea4889b0adc service nova] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Refreshing instance network info cache due to event network-changed-caba6f96-07db-411c-a38b-86be3bb1c71a. {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:11:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bf9a1031-7c45-472c-af2c-24b0421e6853 req-c9632da0-3789-4e13-858a-5ea4889b0adc service nova] Acquiring lock "refresh_cache-dce8722e-982a-458a-9efb-59d08a5717c7" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:11:11 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG nova.network.neutron [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Updating instance_info_cache with network_info: [{"id": "caba6f96-07db-411c-a38b-86be3bb1c71a", "address": "fa:16:3e:f9:0a:e2", "network": {"id": "6bffe37b-19f2-4b15-8425-82fe8d0b0c77", "bridge": "br-int", "label": "tempest-VolumesActionsTest-258539693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "4ebbe37ffda44c76a78244b3928f809b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcaba6f96-07", "ovs_interfaceid": "caba6f96-07db-411c-a38b-86be3bb1c71a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:11:13 user 
nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Releasing lock "refresh_cache-dce8722e-982a-458a-9efb-59d08a5717c7" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG nova.compute.manager [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Instance network_info: |[{"id": "caba6f96-07db-411c-a38b-86be3bb1c71a", "address": "fa:16:3e:f9:0a:e2", "network": {"id": "6bffe37b-19f2-4b15-8425-82fe8d0b0c77", "bridge": "br-int", "label": "tempest-VolumesActionsTest-258539693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "4ebbe37ffda44c76a78244b3928f809b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcaba6f96-07", "ovs_interfaceid": "caba6f96-07db-411c-a38b-86be3bb1c71a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bf9a1031-7c45-472c-af2c-24b0421e6853 req-c9632da0-3789-4e13-858a-5ea4889b0adc service nova] Acquired lock "refresh_cache-dce8722e-982a-458a-9efb-59d08a5717c7" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG nova.network.neutron [req-bf9a1031-7c45-472c-af2c-24b0421e6853 req-c9632da0-3789-4e13-858a-5ea4889b0adc service nova] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Refreshing network info cache for port caba6f96-07db-411c-a38b-86be3bb1c71a {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Start _get_guest_xml network_info=[{"id": "caba6f96-07db-411c-a38b-86be3bb1c71a", "address": "fa:16:3e:f9:0a:e2", "network": {"id": "6bffe37b-19f2-4b15-8425-82fe8d0b0c77", "bridge": "br-int", "label": "tempest-VolumesActionsTest-258539693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "4ebbe37ffda44c76a78244b3928f809b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcaba6f96-07", "ovs_interfaceid": "caba6f96-07db-411c-a38b-86be3bb1c71a", "qbh_params": null, "qbg_params": null, "active": 
false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': 'fcf09ead-c5af-40cc-b5cf-92626e181ef9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 24 00:11:13 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:11:13 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 24 00:11:13 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71205) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-24T00:08:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=), allow threads: True {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Flavor limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Image limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Flavor pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Image pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG 
nova.virt.hardware [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Got 1 possible topologies {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG nova.privsep.utils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71205) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:11:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-285922937',display_name='tempest-VolumesActionsTest-instance-285922937',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-285922937',id=1,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4ebbe37ffda44c76a78244b3928f809b',ramdisk_id='',reservation_id='r-nr0b3uav',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-1689103952',owner_user_name='tempest-VolumesActionsTest-1689103952-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:11:05Z,user_data=None,user_id='31621d02dc5143689c0d8c1280479fdc',uuid=dce8722e-982a-458a-9efb-59d08a5717c7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "caba6f96-07db-411c-a38b-86be3bb1c71a", "address": "fa:16:3e:f9:0a:e2", "network": {"id": "6bffe37b-19f2-4b15-8425-82fe8d0b0c77", "bridge": "br-int", "label": "tempest-VolumesActionsTest-258539693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "4ebbe37ffda44c76a78244b3928f809b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcaba6f96-07", "ovs_interfaceid": "caba6f96-07db-411c-a38b-86be3bb1c71a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71205) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Converting VIF {"id": "caba6f96-07db-411c-a38b-86be3bb1c71a", "address": "fa:16:3e:f9:0a:e2", "network": {"id": "6bffe37b-19f2-4b15-8425-82fe8d0b0c77", "bridge": "br-int", "label": 
"tempest-VolumesActionsTest-258539693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "4ebbe37ffda44c76a78244b3928f809b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcaba6f96-07", "ovs_interfaceid": "caba6f96-07db-411c-a38b-86be3bb1c71a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:0a:e2,bridge_name='br-int',has_traffic_filtering=True,id=caba6f96-07db-411c-a38b-86be3bb1c71a,network=Network(6bffe37b-19f2-4b15-8425-82fe8d0b0c77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcaba6f96-07') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG nova.objects.instance [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Lazy-loading 'pci_devices' on Instance uuid dce8722e-982a-458a-9efb-59d08a5717c7 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] End _get_guest_xml xml= Apr 24 00:11:13 user nova-compute[71205]: dce8722e-982a-458a-9efb-59d08a5717c7 Apr 24 00:11:13 user nova-compute[71205]: instance-00000001 Apr 24 00:11:13 user nova-compute[71205]: 131072 Apr 24 00:11:13 user nova-compute[71205]: 1 Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: tempest-VolumesActionsTest-instance-285922937 Apr 24 00:11:13 user nova-compute[71205]: 2023-04-24 00:11:13 Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: 128 Apr 24 00:11:13 user nova-compute[71205]: 1 Apr 24 00:11:13 user nova-compute[71205]: 0 Apr 24 00:11:13 user nova-compute[71205]: 0 Apr 24 00:11:13 user nova-compute[71205]: 1 Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: tempest-VolumesActionsTest-1689103952-project-member Apr 24 00:11:13 user nova-compute[71205]: tempest-VolumesActionsTest-1689103952 Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: 
Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: OpenStack Foundation Apr 24 00:11:13 user nova-compute[71205]: OpenStack Nova Apr 24 00:11:13 user nova-compute[71205]: 0.0.0 Apr 24 00:11:13 user nova-compute[71205]: dce8722e-982a-458a-9efb-59d08a5717c7 Apr 24 00:11:13 user nova-compute[71205]: dce8722e-982a-458a-9efb-59d08a5717c7 Apr 24 00:11:13 user nova-compute[71205]: Virtual Machine Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: hvm Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Nehalem Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: /dev/urandom Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: Apr 24 00:11:13 user nova-compute[71205]: {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:11:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-285922937',display_name='tempest-VolumesActionsTest-instance-285922937',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-285922937',id=1,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4ebbe37ffda44c76a78244b3928f809b',ramdisk_id='',reservation_id='r-nr0b3uav',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-1689103952',owner_user_name='tempest-VolumesActionsTest-1689103952-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:11:05Z,user_data=None,user_id='31621d02dc5143689c0d8c1280479fdc',uuid=dce8722e-982a-458a-9efb-59d08a5717c7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "caba6f96-07db-411c-a38b-86be3bb1c71a", "address": "fa:16:3e:f9:0a:e2", "network": {"id": "6bffe37b-19f2-4b15-8425-82fe8d0b0c77", "bridge": "br-int", "label": "tempest-VolumesActionsTest-258539693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "4ebbe37ffda44c76a78244b3928f809b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcaba6f96-07", "ovs_interfaceid": "caba6f96-07db-411c-a38b-86be3bb1c71a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Converting VIF {"id": "caba6f96-07db-411c-a38b-86be3bb1c71a", "address": "fa:16:3e:f9:0a:e2", "network": {"id": "6bffe37b-19f2-4b15-8425-82fe8d0b0c77", "bridge": "br-int", "label": 
"tempest-VolumesActionsTest-258539693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "4ebbe37ffda44c76a78244b3928f809b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcaba6f96-07", "ovs_interfaceid": "caba6f96-07db-411c-a38b-86be3bb1c71a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:0a:e2,bridge_name='br-int',has_traffic_filtering=True,id=caba6f96-07db-411c-a38b-86be3bb1c71a,network=Network(6bffe37b-19f2-4b15-8425-82fe8d0b0c77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcaba6f96-07') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG os_vif [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:0a:e2,bridge_name='br-int',has_traffic_filtering=True,id=caba6f96-07db-411c-a38b-86be3bb1c71a,network=Network(6bffe37b-19f2-4b15-8425-82fe8d0b0c77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcaba6f96-07') {{(pid=71205) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Created schema index Interface.name {{(pid=71205) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Created schema index Port.name {{(pid=71205) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Created schema index Bridge.name {{(pid=71205) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] tcp:127.0.0.1:6640: entering CONNECTING {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:11:13 user 
nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [POLLOUT] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:11:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 24 00:11:13 user nova-compute[71205]: INFO oslo.privsep.daemon [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova-cpu.conf', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp44p5ljn5/privsep.sock'] Apr 24 00:11:13 user sudo[80295]: stack : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova-cpu.conf --privsep_context vif_plug_ovs.privsep.vif_plug --privsep_sock_path /tmp/tmp44p5ljn5/privsep.sock Apr 24 00:11:13 user sudo[80295]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1001) Apr 24 00:11:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:14 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 
00:11:15 user sudo[80295]: pam_unix(sudo:session): session closed for user root Apr 24 00:11:15 user nova-compute[71205]: INFO oslo.privsep.daemon [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Spawned new privsep daemon via rootwrap Apr 24 00:11:15 user nova-compute[71205]: INFO oslo.privsep.daemon [-] privsep daemon starting Apr 24 00:11:15 user nova-compute[71205]: INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 Apr 24 00:11:15 user nova-compute[71205]: INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none Apr 24 00:11:15 user nova-compute[71205]: INFO oslo.privsep.daemon [-] privsep daemon running as pid 80302 Apr 24 00:11:15 user nova-compute[71205]: DEBUG nova.network.neutron [req-bf9a1031-7c45-472c-af2c-24b0421e6853 req-c9632da0-3789-4e13-858a-5ea4889b0adc service nova] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Updated VIF entry in instance network info cache for port caba6f96-07db-411c-a38b-86be3bb1c71a. {{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:11:15 user nova-compute[71205]: DEBUG nova.network.neutron [req-bf9a1031-7c45-472c-af2c-24b0421e6853 req-c9632da0-3789-4e13-858a-5ea4889b0adc service nova] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Updating instance_info_cache with network_info: [{"id": "caba6f96-07db-411c-a38b-86be3bb1c71a", "address": "fa:16:3e:f9:0a:e2", "network": {"id": "6bffe37b-19f2-4b15-8425-82fe8d0b0c77", "bridge": "br-int", "label": "tempest-VolumesActionsTest-258539693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "4ebbe37ffda44c76a78244b3928f809b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcaba6f96-07", "ovs_interfaceid": "caba6f96-07db-411c-a38b-86be3bb1c71a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:11:15 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bf9a1031-7c45-472c-af2c-24b0421e6853 req-c9632da0-3789-4e13-858a-5ea4889b0adc service nova] Releasing lock "refresh_cache-dce8722e-982a-458a-9efb-59d08a5717c7" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:11:15 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:15 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcaba6f96-07, may_exist=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:11:15 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, 
table=Interface, record=tapcaba6f96-07, col_values=(('external_ids', {'iface-id': 'caba6f96-07db-411c-a38b-86be3bb1c71a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f9:0a:e2', 'vm-uuid': 'dce8722e-982a-458a-9efb-59d08a5717c7'}),)) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:11:15 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:15 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:11:15 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:15 user nova-compute[71205]: INFO os_vif [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:0a:e2,bridge_name='br-int',has_traffic_filtering=True,id=caba6f96-07db-411c-a38b-86be3bb1c71a,network=Network(6bffe37b-19f2-4b15-8425-82fe8d0b0c77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcaba6f96-07') Apr 24 00:11:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] No BDM found with device name vda, not building metadata. {{(pid=71205) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 24 00:11:15 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] No VIF found with MAC fa:16:3e:f9:0a:e2, not building metadata {{(pid=71205) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 24 00:11:16 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:16 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:16 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:17 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:17 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:17 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:17 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None 
req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Acquiring lock "212a2ad6-77ab-4615-b6ca-f426a3e76ab5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "212a2ad6-77ab-4615-b6ca-f426a3e76ab5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:17 user nova-compute[71205]: DEBUG nova.compute.manager [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Starting instance... {{(pid=71205) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71205) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 24 00:11:18 user nova-compute[71205]: INFO nova.compute.claims [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Claim successful on node user Apr 24 00:11:18 user nova-compute[71205]: DEBUG nova.compute.manager [req-6241e159-5ec0-4f10-8e56-bd7c612f0c87 req-5e024613-fc34-4b1b-8d26-a947682ef00c service nova] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Received event network-vif-plugged-caba6f96-07db-411c-a38b-86be3bb1c71a {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-6241e159-5ec0-4f10-8e56-bd7c612f0c87 req-5e024613-fc34-4b1b-8d26-a947682ef00c service nova] Acquiring lock "dce8722e-982a-458a-9efb-59d08a5717c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-6241e159-5ec0-4f10-8e56-bd7c612f0c87 req-5e024613-fc34-4b1b-8d26-a947682ef00c service nova] Lock "dce8722e-982a-458a-9efb-59d08a5717c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-6241e159-5ec0-4f10-8e56-bd7c612f0c87 req-5e024613-fc34-4b1b-8d26-a947682ef00c service nova] Lock "dce8722e-982a-458a-9efb-59d08a5717c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG nova.compute.manager [req-6241e159-5ec0-4f10-8e56-bd7c612f0c87 req-5e024613-fc34-4b1b-8d26-a947682ef00c service nova] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] No waiting events found dispatching network-vif-plugged-caba6f96-07db-411c-a38b-86be3bb1c71a {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:11:18 user nova-compute[71205]: WARNING nova.compute.manager [req-6241e159-5ec0-4f10-8e56-bd7c612f0c87 req-5e024613-fc34-4b1b-8d26-a947682ef00c service nova] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Received unexpected event network-vif-plugged-caba6f96-07db-411c-a38b-86be3bb1c71a for instance with vm_state building and task_state spawning. 
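
Editor's note: the repeated "Acquiring lock ... / Lock ... acquired by ... :: waited / released :: held" DEBUG entries above are emitted by oslo.concurrency's named locks; nova-compute serializes work such as the resource-tracker claim ("compute_resources") and per-instance event handling ("<uuid>-events") on string-keyed in-process locks. A minimal sketch of the pattern (illustrative only, not Nova source; the function names are made up):

from oslo_concurrency import lockutils

# Decorator form, as used for the "compute_resources" lock seen in the log.
@lockutils.synchronized('compute_resources')
def instance_claim(instance_uuid):
    # Runs only while the named in-process lock is held; oslo.concurrency logs the
    # "acquired by ... :: waited" / "released ... :: held" timings at DEBUG.
    return instance_uuid

# Context-manager form, as used for the per-instance "<uuid>-events" lock.
def pop_instance_event(instance_uuid):
    with lockutils.lock('%s-events' % instance_uuid):
        # Critical section guarding the instance's pending-event table.
        pass
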
Apr 24 00:11:18 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.280s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG nova.compute.manager [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Start building networks asynchronously for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG nova.compute.manager [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Allocating IP information in the background. {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG nova.network.neutron [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] allocate_for_instance() {{(pid=71205) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 24 00:11:18 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 24 00:11:18 user nova-compute[71205]: DEBUG nova.compute.manager [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Start building block device mappings for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG nova.compute.manager [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Start spawning the instance on the hypervisor. {{(pid=71205) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Creating instance directory {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 24 00:11:18 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Creating image(s) Apr 24 00:11:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Acquiring lock "/opt/stack/data/nova/instances/212a2ad6-77ab-4615-b6ca-f426a3e76ab5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "/opt/stack/data/nova/instances/212a2ad6-77ab-4615-b6ca-f426a3e76ab5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "/opt/stack/data/nova/instances/212a2ad6-77ab-4615-b6ca-f426a3e76ab5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.179s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Acquiring lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.005s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Resumed> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:11:18 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] VM Resumed (Lifecycle Event) Apr 24 00:11:18 user nova-compute[71205]: DEBUG nova.compute.manager [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Instance event wait completed in 0 seconds for {{(pid=71205) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Guest created on hypervisor {{(pid=71205) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 24 00:11:18 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Instance spawned successfully. 
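
Editor's note: the qemu-img probes above are wrapped in oslo_concurrency.prlimit so that inspecting an image cannot exhaust the host (address space capped at 1 GiB, CPU time at 30 s) before the cached base image is reused. A minimal sketch that reproduces the same invocation and reads the JSON result (command line and paths copied from the log; not Nova source):

import json
import subprocess

BASE = '/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8'

cmd = [
    '/usr/bin/python3.10', '-m', 'oslo_concurrency.prlimit',
    '--as=1073741824', '--cpu=30', '--',            # resource caps, as in the log
    'env', 'LC_ALL=C', 'LANG=C',
    'qemu-img', 'info', BASE, '--force-share', '--output=json',
]
info = json.loads(subprocess.check_output(cmd))
print(info.get('format'), info.get('virtual-size'))  # e.g. image format and size in bytes
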
Apr 24 00:11:18 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Found default for hw_cdrom_bus of ide {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Found default for hw_disk_bus of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Found default for hw_input_bus of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Found default for hw_pointer_model of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Found default for hw_video_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Found default for hw_vif_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM 
power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:11:18 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:11:18 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Started> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:11:18 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] VM Started (Lifecycle Event) Apr 24 00:11:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.147s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/212a2ad6-77ab-4615-b6ca-f426a3e76ab5/disk 1073741824 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG nova.policy [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '471d341199f0431a95ae54651c4f0780', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1d063c2bdc884fb8b826b9fb6fd97405', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71205) authorize /opt/stack/nova/nova/policy.py:203}} Apr 24 00:11:18 user nova-compute[71205]: INFO nova.compute.manager [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Took 13.96 seconds to spawn the instance on the hypervisor. 
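
Editor's note: the "qemu-img create" entry above is the copy-on-write step of an image-backed boot: the instance disk is a qcow2 overlay whose backing file is the cached base image, sized here to a 1 GiB root disk. A sketch of the same call (arguments copied from the log; illustrative only):

import subprocess

base = '/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8'
disk = '/opt/stack/data/nova/instances/212a2ad6-77ab-4615-b6ca-f426a3e76ab5/disk'

subprocess.check_call([
    'qemu-img', 'create', '-f', 'qcow2',
    '-o', f'backing_file={base},backing_fmt=raw',   # overlay on the raw base image
    disk, '1073741824',                             # 1 GiB root disk
])
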
Apr 24 00:11:18 user nova-compute[71205]: DEBUG nova.compute.manager [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/212a2ad6-77ab-4615-b6ca-f426a3e76ab5/disk 1073741824" returned: 0 in 0.073s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.227s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:18 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:11:19 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:11:19 user nova-compute[71205]: INFO nova.compute.manager [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Took 14.78 seconds to build instance. 
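
Editor's note: the INFO lines "Took 13.96 seconds to spawn the instance on the hypervisor" and "Took 14.78 seconds to build instance" are the most useful timing markers in this log. An illustrative helper (not part of Nova) for pulling them out of a captured journal, assuming something like `journalctl -u devstack@n-cpu > n-cpu.log`:

import re
import sys

# Matches the spawn/build timing lines emitted by nova.compute.manager, as seen above.
PATTERN = re.compile(
    r'\[instance: (?P<uuid>[0-9a-f-]{36})\] Took (?P<secs>[\d.]+) seconds to '
    r'(?P<phase>spawn the instance on the hypervisor|build instance)'
)

for line in open(sys.argv[1], encoding='utf-8', errors='replace'):
    m = PATTERN.search(line)
    if m:
        print(m.group('uuid'), m.group('phase'), m.group('secs') + 's')
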
Apr 24 00:11:19 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.153s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:19 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Checking if we can resize image /opt/stack/data/nova/instances/212a2ad6-77ab-4615-b6ca-f426a3e76ab5/disk. size=1073741824 {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 24 00:11:19 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/212a2ad6-77ab-4615-b6ca-f426a3e76ab5/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-dc9260c5-323b-4676-a3ef-72b72fbc415d tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Lock "dce8722e-982a-458a-9efb-59d08a5717c7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.993s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:19 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/212a2ad6-77ab-4615-b6ca-f426a3e76ab5/disk --force-share --output=json" returned: 0 in 0.151s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:19 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Cannot resize image /opt/stack/data/nova/instances/212a2ad6-77ab-4615-b6ca-f426a3e76ab5/disk to a smaller size. 
{{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 24 00:11:19 user nova-compute[71205]: DEBUG nova.objects.instance [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lazy-loading 'migration_context' on Instance uuid 212a2ad6-77ab-4615-b6ca-f426a3e76ab5 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:11:19 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Created local disks {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 24 00:11:19 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Ensure instance console log exists: /opt/stack/data/nova/instances/212a2ad6-77ab-4615-b6ca-f426a3e76ab5/console.log {{(pid=71205) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 24 00:11:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:20 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Acquiring lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:20 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:20 user nova-compute[71205]: DEBUG nova.compute.manager [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Starting instance... {{(pid=71205) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 24 00:11:20 user nova-compute[71205]: DEBUG nova.compute.manager [req-0d8f23ce-d311-4b4f-a69a-664c32644a6e req-1fc19b33-c37a-428b-8732-fbde14bcca85 service nova] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Received event network-vif-plugged-caba6f96-07db-411c-a38b-86be3bb1c71a {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:11:20 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-0d8f23ce-d311-4b4f-a69a-664c32644a6e req-1fc19b33-c37a-428b-8732-fbde14bcca85 service nova] Acquiring lock "dce8722e-982a-458a-9efb-59d08a5717c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:20 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-0d8f23ce-d311-4b4f-a69a-664c32644a6e req-1fc19b33-c37a-428b-8732-fbde14bcca85 service nova] Lock "dce8722e-982a-458a-9efb-59d08a5717c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:20 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-0d8f23ce-d311-4b4f-a69a-664c32644a6e req-1fc19b33-c37a-428b-8732-fbde14bcca85 service nova] Lock "dce8722e-982a-458a-9efb-59d08a5717c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:20 user nova-compute[71205]: DEBUG nova.compute.manager [req-0d8f23ce-d311-4b4f-a69a-664c32644a6e req-1fc19b33-c37a-428b-8732-fbde14bcca85 service nova] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] No waiting events found dispatching network-vif-plugged-caba6f96-07db-411c-a38b-86be3bb1c71a {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:11:20 user nova-compute[71205]: WARNING nova.compute.manager [req-0d8f23ce-d311-4b4f-a69a-664c32644a6e req-1fc19b33-c37a-428b-8732-fbde14bcca85 service nova] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Received unexpected event network-vif-plugged-caba6f96-07db-411c-a38b-86be3bb1c71a for instance with vm_state active and task_state None. 
Apr 24 00:11:20 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:20 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:20 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71205) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 24 00:11:20 user nova-compute[71205]: INFO nova.compute.claims [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Claim successful on node user Apr 24 00:11:20 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:21 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:11:21 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:11:21 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.562s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:21 user nova-compute[71205]: DEBUG nova.compute.manager [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] 
Start building networks asynchronously for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 24 00:11:21 user nova-compute[71205]: DEBUG nova.compute.manager [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Allocating IP information in the background. {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 24 00:11:21 user nova-compute[71205]: DEBUG nova.network.neutron [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] allocate_for_instance() {{(pid=71205) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 24 00:11:21 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 24 00:11:21 user nova-compute[71205]: DEBUG nova.compute.manager [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Start building block device mappings for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 24 00:11:21 user nova-compute[71205]: DEBUG nova.compute.manager [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Start spawning the instance on the hypervisor. 
{{(pid=71205) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 24 00:11:21 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Creating instance directory {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 24 00:11:21 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Creating image(s) Apr 24 00:11:21 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Acquiring lock "/opt/stack/data/nova/instances/ffbf17ce-e3cb-4099-bea3-6887fef4476d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:21 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Lock "/opt/stack/data/nova/instances/ffbf17ce-e3cb-4099-bea3-6887fef4476d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:21 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Lock "/opt/stack/data/nova/instances/ffbf17ce-e3cb-4099-bea3-6887fef4476d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:21 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:21 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.158s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:21 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 
tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Acquiring lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:21 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.009s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:21 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:21 user nova-compute[71205]: DEBUG nova.policy [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9cf55386d2654a1e86af6882a2aed860', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '218aba2df07b4afaa999399d0981e6bf', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71205) authorize /opt/stack/nova/nova/policy.py:203}} Apr 24 00:11:21 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.163s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:21 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/ffbf17ce-e3cb-4099-bea3-6887fef4476d/disk 1073741824 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:21 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o 
backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/ffbf17ce-e3cb-4099-bea3-6887fef4476d/disk 1073741824" returned: 0 in 0.055s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:21 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.224s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:21 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:21 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.147s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:21 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Checking if we can resize image /opt/stack/data/nova/instances/ffbf17ce-e3cb-4099-bea3-6887fef4476d/disk. 
size=1073741824 {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 24 00:11:21 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ffbf17ce-e3cb-4099-bea3-6887fef4476d/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:22 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ffbf17ce-e3cb-4099-bea3-6887fef4476d/disk --force-share --output=json" returned: 0 in 0.156s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:22 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Cannot resize image /opt/stack/data/nova/instances/ffbf17ce-e3cb-4099-bea3-6887fef4476d/disk to a smaller size. {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 24 00:11:22 user nova-compute[71205]: DEBUG nova.objects.instance [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Lazy-loading 'migration_context' on Instance uuid ffbf17ce-e3cb-4099-bea3-6887fef4476d {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:11:22 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Created local disks {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 24 00:11:22 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Ensure instance console log exists: /opt/stack/data/nova/instances/ffbf17ce-e3cb-4099-bea3-6887fef4476d/console.log {{(pid=71205) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 24 00:11:22 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:22 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:22 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:22 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Acquiring lock "ce19423d-a6ee-4506-9cd1-ec4803abdd86" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:22 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "ce19423d-a6ee-4506-9cd1-ec4803abdd86" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:22 user nova-compute[71205]: DEBUG nova.compute.manager [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Starting instance... {{(pid=71205) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 24 00:11:22 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:22 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:22 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71205) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 24 00:11:22 user nova-compute[71205]: INFO nova.compute.claims [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Claim successful on node user Apr 24 00:11:22 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:11:22 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:11:22 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.329s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:22 user nova-compute[71205]: DEBUG nova.compute.manager [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Start building networks asynchronously for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG nova.compute.manager [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Allocating IP information in the background. {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG nova.network.neutron [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] allocate_for_instance() {{(pid=71205) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 24 00:11:23 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 24 00:11:23 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG nova.compute.manager [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Start building block device mappings for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG nova.compute.manager [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Start spawning the instance on the hypervisor. {{(pid=71205) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Creating instance directory {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 24 00:11:23 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Creating image(s) Apr 24 00:11:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Acquiring lock "/opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "/opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "/opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env 
LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.151s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Acquiring lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.155s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk 1073741824 {{(pid=71205) 
execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk 1073741824" returned: 0 in 0.059s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.220s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG nova.policy [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'abae98323deb44dea0622186485cc7af', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ce75f63fc0904eceb03e8319bddba4d3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71205) authorize /opt/stack/nova/nova/policy.py:203}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.143s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Checking if we can resize image /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk. 
size=1073741824 {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json" returned: 0 in 0.152s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Cannot resize image /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk to a smaller size. {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG nova.objects.instance [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lazy-loading 'migration_context' on Instance uuid ce19423d-a6ee-4506-9cd1-ec4803abdd86 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Created local disks {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Ensure instance console log exists: /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/console.log {{(pid=71205) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:23 user nova-compute[71205]: DEBUG nova.network.neutron [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Successfully created port: 81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54 {{(pid=71205) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 24 00:11:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:24 user nova-compute[71205]: DEBUG nova.network.neutron [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Successfully created port: be0c060b-e1fe-496e-8827-a2699e8a4017 {{(pid=71205) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 24 00:11:24 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Acquiring lock "e762c863-43e1-4f26-ab6b-c8ea40f08887" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:24 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "e762c863-43e1-4f26-ab6b-c8ea40f08887" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:24 user nova-compute[71205]: DEBUG nova.compute.manager [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Starting instance... 
{{(pid=71205) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 24 00:11:25 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:25 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:25 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71205) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 24 00:11:25 user nova-compute[71205]: INFO nova.compute.claims [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Claim successful on node user Apr 24 00:11:25 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:11:25 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:11:25 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.310s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:25 user nova-compute[71205]: DEBUG nova.compute.manager [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Start building 
networks asynchronously for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 24 00:11:25 user nova-compute[71205]: DEBUG nova.compute.manager [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Allocating IP information in the background. {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 24 00:11:25 user nova-compute[71205]: DEBUG nova.network.neutron [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] allocate_for_instance() {{(pid=71205) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 24 00:11:25 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 24 00:11:25 user nova-compute[71205]: DEBUG nova.compute.manager [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Start building block device mappings for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 24 00:11:25 user nova-compute[71205]: DEBUG nova.compute.manager [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Start spawning the instance on the hypervisor. 
{{(pid=71205) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 24 00:11:25 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Creating instance directory {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 24 00:11:25 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Creating image(s) Apr 24 00:11:25 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Acquiring lock "/opt/stack/data/nova/instances/e762c863-43e1-4f26-ab6b-c8ea40f08887/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:25 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "/opt/stack/data/nova/instances/e762c863-43e1-4f26-ab6b-c8ea40f08887/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:25 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "/opt/stack/data/nova/instances/e762c863-43e1-4f26-ab6b-c8ea40f08887/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:25 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:25 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:25 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:25 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] CMD "/usr/bin/python3.10 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.152s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:25 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Acquiring lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:25 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:25 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:25 user nova-compute[71205]: DEBUG nova.policy [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '539997e65f4f4ef7998a4386d19a5e9f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e2bf3154181247f8963be8cd31399851', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71205) authorize /opt/stack/nova/nova/policy.py:203}} Apr 24 00:11:26 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.165s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:26 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o 
backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/e762c863-43e1-4f26-ab6b-c8ea40f08887/disk 1073741824 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:26 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/e762c863-43e1-4f26-ab6b-c8ea40f08887/disk 1073741824" returned: 0 in 0.047s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:26 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.217s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:26 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:26 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.146s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:26 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Checking if we can resize image /opt/stack/data/nova/instances/e762c863-43e1-4f26-ab6b-c8ea40f08887/disk. 
size=1073741824 {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 24 00:11:26 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e762c863-43e1-4f26-ab6b-c8ea40f08887/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:26 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e762c863-43e1-4f26-ab6b-c8ea40f08887/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:26 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Cannot resize image /opt/stack/data/nova/instances/e762c863-43e1-4f26-ab6b-c8ea40f08887/disk to a smaller size. {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 24 00:11:26 user nova-compute[71205]: DEBUG nova.objects.instance [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lazy-loading 'migration_context' on Instance uuid e762c863-43e1-4f26-ab6b-c8ea40f08887 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:11:26 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Created local disks {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 24 00:11:26 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Ensure instance console log exists: /opt/stack/data/nova/instances/e762c863-43e1-4f26-ab6b-c8ea40f08887/console.log {{(pid=71205) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 24 00:11:26 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:26 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock 
"vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:26 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:26 user nova-compute[71205]: DEBUG nova.network.neutron [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Successfully updated port: 81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54 {{(pid=71205) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 24 00:11:26 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Acquiring lock "refresh_cache-212a2ad6-77ab-4615-b6ca-f426a3e76ab5" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:11:26 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Acquired lock "refresh_cache-212a2ad6-77ab-4615-b6ca-f426a3e76ab5" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:11:26 user nova-compute[71205]: DEBUG nova.network.neutron [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Building network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 24 00:11:26 user nova-compute[71205]: DEBUG nova.compute.manager [req-f787d755-cbac-4490-b486-5ed0a739f7e9 req-fddc10d7-7d44-4c57-8806-7d9149fe8d30 service nova] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Received event network-changed-81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:11:26 user nova-compute[71205]: DEBUG nova.compute.manager [req-f787d755-cbac-4490-b486-5ed0a739f7e9 req-fddc10d7-7d44-4c57-8806-7d9149fe8d30 service nova] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Refreshing instance network info cache due to event network-changed-81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54. 
{{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:11:26 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-f787d755-cbac-4490-b486-5ed0a739f7e9 req-fddc10d7-7d44-4c57-8806-7d9149fe8d30 service nova] Acquiring lock "refresh_cache-212a2ad6-77ab-4615-b6ca-f426a3e76ab5" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:11:26 user nova-compute[71205]: DEBUG nova.network.neutron [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Instance cache missing network info. {{(pid=71205) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 24 00:11:26 user nova-compute[71205]: DEBUG nova.network.neutron [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Successfully created port: f6c98734-17ee-42d0-9372-fd76526b0b27 {{(pid=71205) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.network.neutron [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Updating instance_info_cache with network_info: [{"id": "81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54", "address": "fa:16:3e:32:fe:fb", "network": {"id": "c29459b8-bcaa-41ae-94fa-d61344bae22f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2015616970-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d063c2bdc884fb8b826b9fb6fd97405", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap81bcd0fd-3b", "ovs_interfaceid": "81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Releasing lock "refresh_cache-212a2ad6-77ab-4615-b6ca-f426a3e76ab5" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.compute.manager [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Instance network_info: |[{"id": "81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54", "address": "fa:16:3e:32:fe:fb", "network": {"id": "c29459b8-bcaa-41ae-94fa-d61344bae22f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2015616970-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], 
"gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d063c2bdc884fb8b826b9fb6fd97405", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap81bcd0fd-3b", "ovs_interfaceid": "81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-f787d755-cbac-4490-b486-5ed0a739f7e9 req-fddc10d7-7d44-4c57-8806-7d9149fe8d30 service nova] Acquired lock "refresh_cache-212a2ad6-77ab-4615-b6ca-f426a3e76ab5" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.network.neutron [req-f787d755-cbac-4490-b486-5ed0a739f7e9 req-fddc10d7-7d44-4c57-8806-7d9149fe8d30 service nova] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Refreshing network info cache for port 81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54 {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Start _get_guest_xml network_info=[{"id": "81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54", "address": "fa:16:3e:32:fe:fb", "network": {"id": "c29459b8-bcaa-41ae-94fa-d61344bae22f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2015616970-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d063c2bdc884fb8b826b9fb6fd97405", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap81bcd0fd-3b", "ovs_interfaceid": "81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': 
[{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': 'fcf09ead-c5af-40cc-b5cf-92626e181ef9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 24 00:11:27 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:11:27 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71205) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-24T00:08:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=), allow threads: True {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Flavor limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Image limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Flavor pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG 
nova.virt.hardware [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Image pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Got 1 possible topologies {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:11:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1695303135',display_name='tempest-AttachVolumeTestJSON-server-1695303135',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-1695303135',id=2,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLCP71ilDAgiarKoZWp2VpgncCzSb29zFpexe4Gow4OMeIBbuSeM19Qy9FpbyZ23mx7wcJNC4TUUIImLZa0Jkxw/4VzByhN1LXhR6rqRIHWomLMjZmJ53RbDWMdufdl+oQ==',key_name='tempest-keypair-1814203803',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1d063c2bdc884fb8b826b9fb6fd97405',ramdisk_id='',reservation_id='r-dagqmnao',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-1425553791',owner_user_name='tempest-AttachVolumeTestJSON-1425553791-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:11:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='471d341199f0431a95ae54651c4f0780',uuid=212a2ad6-77ab-4615-b6ca-f426a3e76ab5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54", "address": "fa:16:3e:32:fe:fb", "network": {"id": "c29459b8-bcaa-41ae-94fa-d61344bae22f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2015616970-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d063c2bdc884fb8b826b9fb6fd97405", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap81bcd0fd-3b", "ovs_interfaceid": "81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71205) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Converting VIF {"id": "81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54", "address": "fa:16:3e:32:fe:fb", "network": {"id": "c29459b8-bcaa-41ae-94fa-d61344bae22f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2015616970-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "1d063c2bdc884fb8b826b9fb6fd97405", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap81bcd0fd-3b", "ovs_interfaceid": "81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:fe:fb,bridge_name='br-int',has_traffic_filtering=True,id=81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54,network=Network(c29459b8-bcaa-41ae-94fa-d61344bae22f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81bcd0fd-3b') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.objects.instance [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lazy-loading 'pci_devices' on Instance uuid 212a2ad6-77ab-4615-b6ca-f426a3e76ab5 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquiring lock "c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] End _get_guest_xml xml= Apr 24 00:11:27 user nova-compute[71205]: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5 Apr 24 00:11:27 user nova-compute[71205]: instance-00000002 Apr 24 00:11:27 user nova-compute[71205]: 131072 Apr 24 00:11:27 user nova-compute[71205]: 1 Apr 24 00:11:27 user nova-compute[71205]: Apr 24 00:11:27 user nova-compute[71205]: Apr 24 00:11:27 user nova-compute[71205]: Apr 24 00:11:27 user nova-compute[71205]: tempest-AttachVolumeTestJSON-server-1695303135 Apr 24 00:11:27 user nova-compute[71205]: 2023-04-24 00:11:27 Apr 24 00:11:27 user nova-compute[71205]: Apr 24 00:11:27 user nova-compute[71205]: 128 Apr 24 00:11:27 user nova-compute[71205]: 1 Apr 24 00:11:27 user nova-compute[71205]: 0 Apr 24 00:11:27 user 
nova-compute[71205]: [remaining guest domain XML text nodes elided: the XML markup was stripped when this log was captured; surviving values, in order: 0, 1, tempest-AttachVolumeTestJSON-1425553791-project-member, tempest-AttachVolumeTestJSON-1425553791, OpenStack Foundation, OpenStack Nova, 0.0.0, 212a2ad6-77ab-4615-b6ca-f426a3e76ab5 (twice), Virtual Machine, hvm, Nehalem, /dev/urandom] Apr 24 00:11:27 user nova-compute[71205]: {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:11:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1695303135',display_name='tempest-AttachVolumeTestJSON-server-1695303135',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-1695303135',id=2,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLCP71ilDAgiarKoZWp2VpgncCzSb29zFpexe4Gow4OMeIBbuSeM19Qy9FpbyZ23mx7wcJNC4TUUIImLZa0Jkxw/4VzByhN1LXhR6rqRIHWomLMjZmJ53RbDWMdufdl+oQ==',key_name='tempest-keypair-1814203803',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1d063c2bdc884fb8b826b9fb6fd97405',ramdisk_id='',reservation_id='r-dagqmnao',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-1425553791',owner_user_name='tempest-AttachVolumeTestJSON-1425553791-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:11:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='471d341199f0431a95ae54651c4f0780',uuid=212a2ad6-77ab-4615-b6ca-f426a3e76ab5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54", "address": "fa:16:3e:32:fe:fb", "network": {"id": "c29459b8-bcaa-41ae-94fa-d61344bae22f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2015616970-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d063c2bdc884fb8b826b9fb6fd97405", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap81bcd0fd-3b", "ovs_interfaceid": "81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 24 00:11:27 
user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Converting VIF {"id": "81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54", "address": "fa:16:3e:32:fe:fb", "network": {"id": "c29459b8-bcaa-41ae-94fa-d61344bae22f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2015616970-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d063c2bdc884fb8b826b9fb6fd97405", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap81bcd0fd-3b", "ovs_interfaceid": "81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:fe:fb,bridge_name='br-int',has_traffic_filtering=True,id=81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54,network=Network(c29459b8-bcaa-41ae-94fa-d61344bae22f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81bcd0fd-3b') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG os_vif [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:fe:fb,bridge_name='br-int',has_traffic_filtering=True,id=81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54,network=Network(c29459b8-bcaa-41ae-94fa-d61344bae22f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81bcd0fd-3b') {{(pid=71205) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): 
AddPortCommand(_result=None, bridge=br-int, port=tap81bcd0fd-3b, may_exist=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap81bcd0fd-3b, col_values=(('external_ids', {'iface-id': '81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:32:fe:fb', 'vm-uuid': '212a2ad6-77ab-4615-b6ca-f426a3e76ab5'}),)) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.compute.manager [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Starting instance... {{(pid=71205) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:27 user nova-compute[71205]: INFO os_vif [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:fe:fb,bridge_name='br-int',has_traffic_filtering=True,id=81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54,network=Network(c29459b8-bcaa-41ae-94fa-d61344bae22f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81bcd0fd-3b') Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71205) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] No VIF found with MAC fa:16:3e:32:fe:fb, not building metadata {{(pid=71205) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71205) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 24 00:11:27 user nova-compute[71205]: INFO nova.compute.claims [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Claim successful on node user Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.network.neutron [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Successfully created port: d90cab57-5c57-410a-a61e-ab316454a676 {{(pid=71205) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 
'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.network.neutron [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Successfully updated port: be0c060b-e1fe-496e-8827-a2699e8a4017 {{(pid=71205) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Acquiring lock "refresh_cache-ffbf17ce-e3cb-4099-bea3-6887fef4476d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Acquired lock "refresh_cache-ffbf17ce-e3cb-4099-bea3-6887fef4476d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.network.neutron [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Building network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.456s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:27 user nova-compute[71205]: DEBUG nova.compute.manager [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Start building networks asynchronously for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG nova.compute.manager [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Allocating IP information in the background. 
{{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG nova.network.neutron [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] allocate_for_instance() {{(pid=71205) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 24 00:11:28 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 24 00:11:28 user nova-compute[71205]: DEBUG nova.compute.manager [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Start building block device mappings for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG nova.network.neutron [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Instance cache missing network info. {{(pid=71205) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG nova.compute.manager [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Start spawning the instance on the hypervisor. 
{{(pid=71205) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Creating instance directory {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 24 00:11:28 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Creating image(s) Apr 24 00:11:28 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquiring lock "/opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "/opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "/opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG nova.compute.manager [req-fb6f8e56-a841-4bd2-bb76-6a12a4205803 req-e1cfde75-4ca4-4f79-b22c-d072821f80d6 service nova] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Received event network-changed-be0c060b-e1fe-496e-8827-a2699e8a4017 {{(pid=71205) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG nova.compute.manager [req-fb6f8e56-a841-4bd2-bb76-6a12a4205803 req-e1cfde75-4ca4-4f79-b22c-d072821f80d6 service nova] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Refreshing instance network info cache due to event network-changed-be0c060b-e1fe-496e-8827-a2699e8a4017. {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-fb6f8e56-a841-4bd2-bb76-6a12a4205803 req-e1cfde75-4ca4-4f79-b22c-d072821f80d6 service nova] Acquiring lock "refresh_cache-ffbf17ce-e3cb-4099-bea3-6887fef4476d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.152s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquiring lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG nova.policy [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8d0ab07106dd4995aa7e3f5b6bc70e56', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd26ba1ed4b9241f9a084db1a14a945bb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 
'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71205) authorize /opt/stack/nova/nova/policy.py:203}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.159s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk 1073741824 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk 1073741824" returned: 0 in 0.045s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.208s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.131s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Checking if we can resize image /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk. size=1073741824 {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json" returned: 0 in 0.189s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Cannot resize image /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk to a smaller size. 
{{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG nova.objects.instance [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lazy-loading 'migration_context' on Instance uuid c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Created local disks {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Ensure instance console log exists: /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/console.log {{(pid=71205) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:28 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.network.neutron [req-f787d755-cbac-4490-b486-5ed0a739f7e9 req-fddc10d7-7d44-4c57-8806-7d9149fe8d30 service nova] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Updated VIF entry in instance network info cache for port 81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54. 
{{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.network.neutron [req-f787d755-cbac-4490-b486-5ed0a739f7e9 req-fddc10d7-7d44-4c57-8806-7d9149fe8d30 service nova] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Updating instance_info_cache with network_info: [{"id": "81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54", "address": "fa:16:3e:32:fe:fb", "network": {"id": "c29459b8-bcaa-41ae-94fa-d61344bae22f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2015616970-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d063c2bdc884fb8b826b9fb6fd97405", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap81bcd0fd-3b", "ovs_interfaceid": "81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-f787d755-cbac-4490-b486-5ed0a739f7e9 req-fddc10d7-7d44-4c57-8806-7d9149fe8d30 service nova] Releasing lock "refresh_cache-212a2ad6-77ab-4615-b6ca-f426a3e76ab5" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.network.neutron [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Successfully updated port: f6c98734-17ee-42d0-9372-fd76526b0b27 {{(pid=71205) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Acquiring lock "refresh_cache-ce19423d-a6ee-4506-9cd1-ec4803abdd86" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Acquired lock "refresh_cache-ce19423d-a6ee-4506-9cd1-ec4803abdd86" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.network.neutron [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Building network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.network.neutron [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] 
[instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Updating instance_info_cache with network_info: [{"id": "be0c060b-e1fe-496e-8827-a2699e8a4017", "address": "fa:16:3e:d3:61:2a", "network": {"id": "431c37bf-e6d7-4aa7-9081-a8c7bb5d5f3c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-800590822-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "218aba2df07b4afaa999399d0981e6bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe0c060b-e1", "ovs_interfaceid": "be0c060b-e1fe-496e-8827-a2699e8a4017", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Releasing lock "refresh_cache-ffbf17ce-e3cb-4099-bea3-6887fef4476d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.compute.manager [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Instance network_info: |[{"id": "be0c060b-e1fe-496e-8827-a2699e8a4017", "address": "fa:16:3e:d3:61:2a", "network": {"id": "431c37bf-e6d7-4aa7-9081-a8c7bb5d5f3c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-800590822-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "218aba2df07b4afaa999399d0981e6bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe0c060b-e1", "ovs_interfaceid": "be0c060b-e1fe-496e-8827-a2699e8a4017", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-fb6f8e56-a841-4bd2-bb76-6a12a4205803 req-e1cfde75-4ca4-4f79-b22c-d072821f80d6 service nova] Acquired lock "refresh_cache-ffbf17ce-e3cb-4099-bea3-6887fef4476d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.network.neutron [req-fb6f8e56-a841-4bd2-bb76-6a12a4205803 req-e1cfde75-4ca4-4f79-b22c-d072821f80d6 service nova] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Refreshing network info cache for port 
be0c060b-e1fe-496e-8827-a2699e8a4017 {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Start _get_guest_xml network_info=[{"id": "be0c060b-e1fe-496e-8827-a2699e8a4017", "address": "fa:16:3e:d3:61:2a", "network": {"id": "431c37bf-e6d7-4aa7-9081-a8c7bb5d5f3c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-800590822-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "218aba2df07b4afaa999399d0981e6bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe0c060b-e1", "ovs_interfaceid": "be0c060b-e1fe-496e-8827-a2699e8a4017", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': 'fcf09ead-c5af-40cc-b5cf-92626e181ef9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 24 00:11:29 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:11:29 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
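For reference on the disk-preparation records above (00:11:28): nova's Qcow2 image backend creates the instance disk as a qcow2 overlay on the cached _base image and checks it with qemu-img info (run under prlimit). Below is a minimal standalone sketch, in Python with the stdlib subprocess module, of those two qemu-img invocations; it is not nova code, and the paths and size are simply the ones that appear in this log.

```python
# Sketch only: replay the qcow2 overlay creation and info check logged above.
# Assumes qemu-img is installed and the DevStack paths from this log exist.
import subprocess

base = "/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8"
disk = "/opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk"

# qemu-img create -f qcow2 -o backing_file=<base>,backing_fmt=raw <disk> 1073741824
subprocess.run(
    ["qemu-img", "create", "-f", "qcow2",
     "-o", f"backing_file={base},backing_fmt=raw",
     disk, "1073741824"],
    check=True,
)

# qemu-img info --force-share --output=json is what nova runs (under prlimit)
# before and after creating the overlay, to read the virtual size and format.
info = subprocess.run(
    ["qemu-img", "info", "--force-share", "--output=json", disk],
    capture_output=True, text=True, check=True,
)
print(info.stdout)
```

This mirrors the sequence in the log: info on the base image, create the overlay, then info on the new disk ahead of the resize check that follows.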
Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71205) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-24T00:08:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=), allow threads: True {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Flavor limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Image limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Flavor pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Image pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 
24 00:11:29 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Got 1 possible topologies {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:11:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-437284520',display_name='tempest-DeleteServersTestJSON-server-437284520',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-437284520',id=3,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='218aba2df07b4afaa999399d0981e6bf',ramdisk_id='',reservation_id='r-v6rykihc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1102950130',owner_user_name='tempest-DeleteServersTestJSON-1102950130-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-
24T00:11:21Z,user_data=None,user_id='9cf55386d2654a1e86af6882a2aed860',uuid=ffbf17ce-e3cb-4099-bea3-6887fef4476d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "be0c060b-e1fe-496e-8827-a2699e8a4017", "address": "fa:16:3e:d3:61:2a", "network": {"id": "431c37bf-e6d7-4aa7-9081-a8c7bb5d5f3c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-800590822-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "218aba2df07b4afaa999399d0981e6bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe0c060b-e1", "ovs_interfaceid": "be0c060b-e1fe-496e-8827-a2699e8a4017", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71205) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Converting VIF {"id": "be0c060b-e1fe-496e-8827-a2699e8a4017", "address": "fa:16:3e:d3:61:2a", "network": {"id": "431c37bf-e6d7-4aa7-9081-a8c7bb5d5f3c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-800590822-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "218aba2df07b4afaa999399d0981e6bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe0c060b-e1", "ovs_interfaceid": "be0c060b-e1fe-496e-8827-a2699e8a4017", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:61:2a,bridge_name='br-int',has_traffic_filtering=True,id=be0c060b-e1fe-496e-8827-a2699e8a4017,network=Network(431c37bf-e6d7-4aa7-9081-a8c7bb5d5f3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe0c060b-e1') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.objects.instance [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Lazy-loading 'pci_devices' on Instance uuid ffbf17ce-e3cb-4099-bea3-6887fef4476d {{(pid=71205) obj_load_attr 
/opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.network.neutron [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Instance cache missing network info. {{(pid=71205) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] End _get_guest_xml xml= [guest domain XML elided: the XML markup was stripped when this log was captured, leaving only journald-prefixed text values; what survives shows uuid ffbf17ce-e3cb-4099-bea3-6887fef4476d, name instance-00000003, memory 131072 (KiB), 1 vCPU, nova metadata (name tempest-DeleteServersTestJSON-server-437284520, creationTime 2023-04-24 00:11:29, flavor values 128 MB memory / 1 GB root / 0 swap / 0 ephemeral / 1 vCPU, user tempest-DeleteServersTestJSON-1102950130-project-member, project tempest-DeleteServersTestJSON-1102950130), sysinfo strings OpenStack Foundation / OpenStack Nova / 0.0.0 / Virtual Machine, os type hvm, CPU model Nehalem, and RNG backend /dev/urandom]
{{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:11:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-437284520',display_name='tempest-DeleteServersTestJSON-server-437284520',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-437284520',id=3,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='218aba2df07b4afaa999399d0981e6bf',ramdisk_id='',reservation_id='r-v6rykihc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1102950130',owner_user_name='tempest-DeleteServersTestJSON-1102950130-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:11:21Z,user_data=None,user_id='9cf55386d2654a1e86af6882a2aed860',uuid=ffbf17ce-e3cb-4099-bea3-6887fef4476d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": 
"be0c060b-e1fe-496e-8827-a2699e8a4017", "address": "fa:16:3e:d3:61:2a", "network": {"id": "431c37bf-e6d7-4aa7-9081-a8c7bb5d5f3c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-800590822-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "218aba2df07b4afaa999399d0981e6bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe0c060b-e1", "ovs_interfaceid": "be0c060b-e1fe-496e-8827-a2699e8a4017", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Converting VIF {"id": "be0c060b-e1fe-496e-8827-a2699e8a4017", "address": "fa:16:3e:d3:61:2a", "network": {"id": "431c37bf-e6d7-4aa7-9081-a8c7bb5d5f3c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-800590822-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "218aba2df07b4afaa999399d0981e6bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe0c060b-e1", "ovs_interfaceid": "be0c060b-e1fe-496e-8827-a2699e8a4017", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:61:2a,bridge_name='br-int',has_traffic_filtering=True,id=be0c060b-e1fe-496e-8827-a2699e8a4017,network=Network(431c37bf-e6d7-4aa7-9081-a8c7bb5d5f3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe0c060b-e1') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG os_vif [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:61:2a,bridge_name='br-int',has_traffic_filtering=True,id=be0c060b-e1fe-496e-8827-a2699e8a4017,network=Network(431c37bf-e6d7-4aa7-9081-a8c7bb5d5f3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe0c060b-e1') {{(pid=71205) plug 
/usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbe0c060b-e1, may_exist=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbe0c060b-e1, col_values=(('external_ids', {'iface-id': 'be0c060b-e1fe-496e-8827-a2699e8a4017', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d3:61:2a', 'vm-uuid': 'ffbf17ce-e3cb-4099-bea3-6887fef4476d'}),)) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:29 user nova-compute[71205]: INFO os_vif [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:61:2a,bridge_name='br-int',has_traffic_filtering=True,id=be0c060b-e1fe-496e-8827-a2699e8a4017,network=Network(431c37bf-e6d7-4aa7-9081-a8c7bb5d5f3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe0c060b-e1') Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71205) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] No VIF found with MAC fa:16:3e:d3:61:2a, not building metadata {{(pid=71205) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.network.neutron [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Successfully updated port: d90cab57-5c57-410a-a61e-ab316454a676 {{(pid=71205) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Acquiring lock "refresh_cache-e762c863-43e1-4f26-ab6b-c8ea40f08887" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Acquired lock "refresh_cache-e762c863-43e1-4f26-ab6b-c8ea40f08887" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.network.neutron [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Building network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.compute.manager [req-31f698ef-364c-4709-b077-41f752094bd1 req-9d9688b2-7f9c-4137-9f8e-113595664605 service nova] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Received event network-changed-d90cab57-5c57-410a-a61e-ab316454a676 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG nova.compute.manager [req-31f698ef-364c-4709-b077-41f752094bd1 req-9d9688b2-7f9c-4137-9f8e-113595664605 service nova] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Refreshing instance network info cache due to event network-changed-d90cab57-5c57-410a-a61e-ab316454a676. {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:11:29 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-31f698ef-364c-4709-b077-41f752094bd1 req-9d9688b2-7f9c-4137-9f8e-113595664605 service nova] Acquiring lock "refresh_cache-e762c863-43e1-4f26-ab6b-c8ea40f08887" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:11:30 user nova-compute[71205]: DEBUG nova.network.neutron [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Instance cache missing network info. 
{{(pid=71205) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 24 00:11:30 user nova-compute[71205]: DEBUG nova.compute.manager [req-c36bdc29-daa4-404e-8d29-6dbc25ef5aba req-2bc398fa-31a6-4806-b6fa-e42f1bcb0085 service nova] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Received event network-vif-plugged-81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:11:30 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-c36bdc29-daa4-404e-8d29-6dbc25ef5aba req-2bc398fa-31a6-4806-b6fa-e42f1bcb0085 service nova] Acquiring lock "212a2ad6-77ab-4615-b6ca-f426a3e76ab5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:30 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-c36bdc29-daa4-404e-8d29-6dbc25ef5aba req-2bc398fa-31a6-4806-b6fa-e42f1bcb0085 service nova] Lock "212a2ad6-77ab-4615-b6ca-f426a3e76ab5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:30 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-c36bdc29-daa4-404e-8d29-6dbc25ef5aba req-2bc398fa-31a6-4806-b6fa-e42f1bcb0085 service nova] Lock "212a2ad6-77ab-4615-b6ca-f426a3e76ab5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:30 user nova-compute[71205]: DEBUG nova.compute.manager [req-c36bdc29-daa4-404e-8d29-6dbc25ef5aba req-2bc398fa-31a6-4806-b6fa-e42f1bcb0085 service nova] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] No waiting events found dispatching network-vif-plugged-81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:11:30 user nova-compute[71205]: WARNING nova.compute.manager [req-c36bdc29-daa4-404e-8d29-6dbc25ef5aba req-2bc398fa-31a6-4806-b6fa-e42f1bcb0085 service nova] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Received unexpected event network-vif-plugged-81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54 for instance with vm_state building and task_state spawning. Apr 24 00:11:30 user nova-compute[71205]: DEBUG nova.compute.manager [req-c36bdc29-daa4-404e-8d29-6dbc25ef5aba req-2bc398fa-31a6-4806-b6fa-e42f1bcb0085 service nova] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Received event network-changed-f6c98734-17ee-42d0-9372-fd76526b0b27 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:11:30 user nova-compute[71205]: DEBUG nova.compute.manager [req-c36bdc29-daa4-404e-8d29-6dbc25ef5aba req-2bc398fa-31a6-4806-b6fa-e42f1bcb0085 service nova] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Refreshing instance network info cache due to event network-changed-f6c98734-17ee-42d0-9372-fd76526b0b27. 
{{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:11:30 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-c36bdc29-daa4-404e-8d29-6dbc25ef5aba req-2bc398fa-31a6-4806-b6fa-e42f1bcb0085 service nova] Acquiring lock "refresh_cache-ce19423d-a6ee-4506-9cd1-ec4803abdd86" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:11:30 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:30 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.network.neutron [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Updating instance_info_cache with network_info: [{"id": "d90cab57-5c57-410a-a61e-ab316454a676", "address": "fa:16:3e:07:f9:7b", "network": {"id": "e09e353d-220c-429c-9767-a11a59bbf99a", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2107564613-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e2bf3154181247f8963be8cd31399851", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd90cab57-5c", "ovs_interfaceid": "d90cab57-5c57-410a-a61e-ab316454a676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Releasing lock "refresh_cache-e762c863-43e1-4f26-ab6b-c8ea40f08887" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.compute.manager [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Instance network_info: |[{"id": "d90cab57-5c57-410a-a61e-ab316454a676", "address": "fa:16:3e:07:f9:7b", "network": {"id": "e09e353d-220c-429c-9767-a11a59bbf99a", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2107564613-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e2bf3154181247f8963be8cd31399851", "mtu": 1442, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd90cab57-5c", "ovs_interfaceid": "d90cab57-5c57-410a-a61e-ab316454a676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-31f698ef-364c-4709-b077-41f752094bd1 req-9d9688b2-7f9c-4137-9f8e-113595664605 service nova] Acquired lock "refresh_cache-e762c863-43e1-4f26-ab6b-c8ea40f08887" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.network.neutron [req-31f698ef-364c-4709-b077-41f752094bd1 req-9d9688b2-7f9c-4137-9f8e-113595664605 service nova] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Refreshing network info cache for port d90cab57-5c57-410a-a61e-ab316454a676 {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Start _get_guest_xml network_info=[{"id": "d90cab57-5c57-410a-a61e-ab316454a676", "address": "fa:16:3e:07:f9:7b", "network": {"id": "e09e353d-220c-429c-9767-a11a59bbf99a", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2107564613-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e2bf3154181247f8963be8cd31399851", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd90cab57-5c", "ovs_interfaceid": "d90cab57-5c57-410a-a61e-ab316454a676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': 'fcf09ead-c5af-40cc-b5cf-92626e181ef9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71205) _get_guest_xml 
/opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 24 00:11:31 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:11:31 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71205) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-24T00:08:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=), allow threads: True {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Flavor limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Image limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Flavor pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Image pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 24 00:11:31 user 
nova-compute[71205]: DEBUG nova.virt.hardware [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Got 1 possible topologies {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:11:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-952005257',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-952005257',id=5,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH/KzlzjtqauDBrnzunF6cUjlci7dQJPS3JVA4rtGDgOFyp/2HY51uowvViM2HhE4lJZuUeHsvbKt1HzFUPo9qtzdZ2uB0Xr7EHNKxzFBZkrBwvigyr3VT3tKQN0UGAShA==',key_name='tempest-keypair-573827894',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e2bf3154181247f8963be8cd31399851',ramdisk_id='',reservation_id='r-oy4uh9un',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1947115496',owner_user_name='tempest-AttachVolumeShelveTestJSON-1947115496-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:11:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='539997e65f4f4ef7998a4386d19a5e9f',uuid=e762c863-43e1-4f26-ab6b-c8ea40f08887,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d90cab57-5c57-410a-a61e-ab316454a676", "address": "fa:16:3e:07:f9:7b", "network": {"id": "e09e353d-220c-429c-9767-a11a59bbf99a", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2107564613-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e2bf3154181247f8963be8cd31399851", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd90cab57-5c", "ovs_interfaceid": "d90cab57-5c57-410a-a61e-ab316454a676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71205) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Converting VIF {"id": "d90cab57-5c57-410a-a61e-ab316454a676", "address": "fa:16:3e:07:f9:7b", "network": {"id": "e09e353d-220c-429c-9767-a11a59bbf99a", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2107564613-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], 
"meta": {"injected": false, "tenant_id": "e2bf3154181247f8963be8cd31399851", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd90cab57-5c", "ovs_interfaceid": "d90cab57-5c57-410a-a61e-ab316454a676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:07:f9:7b,bridge_name='br-int',has_traffic_filtering=True,id=d90cab57-5c57-410a-a61e-ab316454a676,network=Network(e09e353d-220c-429c-9767-a11a59bbf99a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd90cab57-5c') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.objects.instance [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lazy-loading 'pci_devices' on Instance uuid e762c863-43e1-4f26-ab6b-c8ea40f08887 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] End _get_guest_xml xml= Apr 24 00:11:31 user nova-compute[71205]: e762c863-43e1-4f26-ab6b-c8ea40f08887 Apr 24 00:11:31 user nova-compute[71205]: instance-00000005 Apr 24 00:11:31 user nova-compute[71205]: 131072 Apr 24 00:11:31 user nova-compute[71205]: 1 Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: tempest-AttachVolumeShelveTestJSON-server-952005257 Apr 24 00:11:31 user nova-compute[71205]: 2023-04-24 00:11:31 Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: 128 Apr 24 00:11:31 user nova-compute[71205]: 1 Apr 24 00:11:31 user nova-compute[71205]: 0 Apr 24 00:11:31 user nova-compute[71205]: 0 Apr 24 00:11:31 user nova-compute[71205]: 1 Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: tempest-AttachVolumeShelveTestJSON-1947115496-project-member Apr 24 00:11:31 user nova-compute[71205]: tempest-AttachVolumeShelveTestJSON-1947115496 Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: OpenStack Foundation Apr 24 00:11:31 user nova-compute[71205]: OpenStack Nova Apr 24 00:11:31 user 
nova-compute[71205]: 0.0.0 Apr 24 00:11:31 user nova-compute[71205]: e762c863-43e1-4f26-ab6b-c8ea40f08887 Apr 24 00:11:31 user nova-compute[71205]: e762c863-43e1-4f26-ab6b-c8ea40f08887 Apr 24 00:11:31 user nova-compute[71205]: Virtual Machine Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: hvm Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Nehalem Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: /dev/urandom Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:11:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-952005257',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-952005257',id=5,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH/KzlzjtqauDBrnzunF6cUjlci7dQJPS3JVA4rtGDgOFyp/2HY51uowvViM2HhE4lJZuUeHsvbKt1HzFUPo9qtzdZ2uB0Xr7EHNKxzFBZkrBwvigyr3VT3tKQN0UGAShA==',key_name='tempest-keypair-573827894',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e2bf3154181247f8963be8cd31399851',ramdisk_id='',reservation_id='r-oy4uh9un',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1947115496',owner_user_name='tempest-AttachVolumeShelveTestJSON-1947115496-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:11:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='539997e65f4f4ef7998a4386d19a5e9f',uuid=e762c863-43e1-4f26-ab6b-c8ea40f08887,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d90cab57-5c57-410a-a61e-ab316454a676", "address": "fa:16:3e:07:f9:7b", "network": {"id": "e09e353d-220c-429c-9767-a11a59bbf99a", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2107564613-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e2bf3154181247f8963be8cd31399851", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd90cab57-5c", "ovs_interfaceid": "d90cab57-5c57-410a-a61e-ab316454a676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Converting VIF {"id": "d90cab57-5c57-410a-a61e-ab316454a676", "address": "fa:16:3e:07:f9:7b", "network": {"id": "e09e353d-220c-429c-9767-a11a59bbf99a", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2107564613-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "e2bf3154181247f8963be8cd31399851", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd90cab57-5c", "ovs_interfaceid": "d90cab57-5c57-410a-a61e-ab316454a676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:07:f9:7b,bridge_name='br-int',has_traffic_filtering=True,id=d90cab57-5c57-410a-a61e-ab316454a676,network=Network(e09e353d-220c-429c-9767-a11a59bbf99a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd90cab57-5c') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG os_vif [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:07:f9:7b,bridge_name='br-int',has_traffic_filtering=True,id=d90cab57-5c57-410a-a61e-ab316454a676,network=Network(e09e353d-220c-429c-9767-a11a59bbf99a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd90cab57-5c') {{(pid=71205) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd90cab57-5c, may_exist=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd90cab57-5c, col_values=(('external_ids', {'iface-id': 'd90cab57-5c57-410a-a61e-ab316454a676', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:07:f9:7b', 'vm-uuid': 'e762c863-43e1-4f26-ab6b-c8ea40f08887'}),)) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:11:31 user 
nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:31 user nova-compute[71205]: INFO os_vif [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:07:f9:7b,bridge_name='br-int',has_traffic_filtering=True,id=d90cab57-5c57-410a-a61e-ab316454a676,network=Network(e09e353d-220c-429c-9767-a11a59bbf99a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd90cab57-5c') Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.network.neutron [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Updating instance_info_cache with network_info: [{"id": "f6c98734-17ee-42d0-9372-fd76526b0b27", "address": "fa:16:3e:72:c7:21", "network": {"id": "2b9edd97-2af0-4b09-993c-95cc2b427fd8", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1649565193-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ce75f63fc0904eceb03e8319bddba4d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6c98734-17", "ovs_interfaceid": "f6c98734-17ee-42d0-9372-fd76526b0b27", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Releasing lock "refresh_cache-ce19423d-a6ee-4506-9cd1-ec4803abdd86" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.compute.manager [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Instance network_info: |[{"id": "f6c98734-17ee-42d0-9372-fd76526b0b27", "address": "fa:16:3e:72:c7:21", "network": {"id": "2b9edd97-2af0-4b09-993c-95cc2b427fd8", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1649565193-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ce75f63fc0904eceb03e8319bddba4d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6c98734-17", "ovs_interfaceid": "f6c98734-17ee-42d0-9372-fd76526b0b27", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-c36bdc29-daa4-404e-8d29-6dbc25ef5aba req-2bc398fa-31a6-4806-b6fa-e42f1bcb0085 service nova] Acquired lock "refresh_cache-ce19423d-a6ee-4506-9cd1-ec4803abdd86" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.network.neutron [req-c36bdc29-daa4-404e-8d29-6dbc25ef5aba req-2bc398fa-31a6-4806-b6fa-e42f1bcb0085 service nova] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Refreshing network info cache for port f6c98734-17ee-42d0-9372-fd76526b0b27 {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Start _get_guest_xml network_info=[{"id": "f6c98734-17ee-42d0-9372-fd76526b0b27", "address": "fa:16:3e:72:c7:21", "network": {"id": "2b9edd97-2af0-4b09-993c-95cc2b427fd8", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1649565193-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ce75f63fc0904eceb03e8319bddba4d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6c98734-17", "ovs_interfaceid": "f6c98734-17ee-42d0-9372-fd76526b0b27", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 
'disk_bus': 'virtio', 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': 'fcf09ead-c5af-40cc-b5cf-92626e181ef9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] No BDM found with device name vda, not building metadata. {{(pid=71205) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] No VIF found with MAC fa:16:3e:07:f9:7b, not building metadata {{(pid=71205) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 24 00:11:31 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:11:31 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71205) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-24T00:08:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=), allow threads: True {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Flavor limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG 
nova.virt.hardware [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Image limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Flavor pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Image pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Got 1 possible topologies {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:11:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-109140451',display_name='tempest-ServersNegativeTestJSON-server-109140451',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-109140451',id=4,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ce75f63fc0904eceb03e8319bddba4d3',ramdisk_id='',reservation_id='r-e7rl04pc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-380105770',owner_user_name='tempest-ServersNegativeTestJSON-380105770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:11:23Z,user_data=None,user_id='abae98323deb44dea0622186485cc7af',uuid=ce19423d-a6ee-4506-9cd1-ec4803abdd86,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f6c98734-17ee-42d0-9372-fd76526b0b27", "address": "fa:16:3e:72:c7:21", "network": {"id": "2b9edd97-2af0-4b09-993c-95cc2b427fd8", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1649565193-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ce75f63fc0904eceb03e8319bddba4d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6c98734-17", "ovs_interfaceid": "f6c98734-17ee-42d0-9372-fd76526b0b27", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71205) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Converting VIF {"id": "f6c98734-17ee-42d0-9372-fd76526b0b27", "address": "fa:16:3e:72:c7:21", "network": {"id": 
"2b9edd97-2af0-4b09-993c-95cc2b427fd8", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1649565193-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ce75f63fc0904eceb03e8319bddba4d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6c98734-17", "ovs_interfaceid": "f6c98734-17ee-42d0-9372-fd76526b0b27", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:72:c7:21,bridge_name='br-int',has_traffic_filtering=True,id=f6c98734-17ee-42d0-9372-fd76526b0b27,network=Network(2b9edd97-2af0-4b09-993c-95cc2b427fd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6c98734-17') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.objects.instance [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lazy-loading 'pci_devices' on Instance uuid ce19423d-a6ee-4506-9cd1-ec4803abdd86 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] End _get_guest_xml xml= Apr 24 00:11:31 user nova-compute[71205]: ce19423d-a6ee-4506-9cd1-ec4803abdd86 Apr 24 00:11:31 user nova-compute[71205]: instance-00000004 Apr 24 00:11:31 user nova-compute[71205]: 131072 Apr 24 00:11:31 user nova-compute[71205]: 1 Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: tempest-ServersNegativeTestJSON-server-109140451 Apr 24 00:11:31 user nova-compute[71205]: 2023-04-24 00:11:31 Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: 128 Apr 24 00:11:31 user nova-compute[71205]: 1 Apr 24 00:11:31 user nova-compute[71205]: 0 Apr 24 00:11:31 user nova-compute[71205]: 0 Apr 24 00:11:31 user nova-compute[71205]: 1 Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: tempest-ServersNegativeTestJSON-380105770-project-member Apr 24 00:11:31 user nova-compute[71205]: tempest-ServersNegativeTestJSON-380105770 Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user 
nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: OpenStack Foundation Apr 24 00:11:31 user nova-compute[71205]: OpenStack Nova Apr 24 00:11:31 user nova-compute[71205]: 0.0.0 Apr 24 00:11:31 user nova-compute[71205]: ce19423d-a6ee-4506-9cd1-ec4803abdd86 Apr 24 00:11:31 user nova-compute[71205]: ce19423d-a6ee-4506-9cd1-ec4803abdd86 Apr 24 00:11:31 user nova-compute[71205]: Virtual Machine Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: hvm Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Nehalem Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: /dev/urandom Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: Apr 24 00:11:31 user nova-compute[71205]: {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:11:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-109140451',display_name='tempest-ServersNegativeTestJSON-server-109140451',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-109140451',id=4,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ce75f63fc0904eceb03e8319bddba4d3',ramdisk_id='',reservation_id='r-e7rl04pc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-380105770',owner_user_name='tempest-ServersNegativeTestJSON-380105770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:11:23Z,user_data=None,user_id='abae98323deb44dea0622186485cc7af',uuid=ce19423d-a6ee-4506-9cd1-ec4803abdd86,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f6c98734-17ee-42d0-9372-fd76526b0b27", "address": "fa:16:3e:72:c7:21", "network": {"id": "2b9edd97-2af0-4b09-993c-95cc2b427fd8", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1649565193-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ce75f63fc0904eceb03e8319bddba4d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6c98734-17", "ovs_interfaceid": "f6c98734-17ee-42d0-9372-fd76526b0b27", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Converting VIF {"id": "f6c98734-17ee-42d0-9372-fd76526b0b27", "address": "fa:16:3e:72:c7:21", "network": {"id": 
"2b9edd97-2af0-4b09-993c-95cc2b427fd8", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1649565193-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ce75f63fc0904eceb03e8319bddba4d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6c98734-17", "ovs_interfaceid": "f6c98734-17ee-42d0-9372-fd76526b0b27", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:72:c7:21,bridge_name='br-int',has_traffic_filtering=True,id=f6c98734-17ee-42d0-9372-fd76526b0b27,network=Network(2b9edd97-2af0-4b09-993c-95cc2b427fd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6c98734-17') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG os_vif [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:72:c7:21,bridge_name='br-int',has_traffic_filtering=True,id=f6c98734-17ee-42d0-9372-fd76526b0b27,network=Network(2b9edd97-2af0-4b09-993c-95cc2b427fd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6c98734-17') {{(pid=71205) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf6c98734-17, may_exist=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): 
DbSetCommand(_result=None, table=Interface, record=tapf6c98734-17, col_values=(('external_ids', {'iface-id': 'f6c98734-17ee-42d0-9372-fd76526b0b27', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:72:c7:21', 'vm-uuid': 'ce19423d-a6ee-4506-9cd1-ec4803abdd86'}),)) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:31 user nova-compute[71205]: INFO os_vif [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:72:c7:21,bridge_name='br-int',has_traffic_filtering=True,id=f6c98734-17ee-42d0-9372-fd76526b0b27,network=Network(2b9edd97-2af0-4b09-993c-95cc2b427fd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6c98734-17') Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Resumed> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:11:31 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] VM Resumed (Lifecycle Event) Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.compute.manager [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Instance event wait completed in 0 seconds for {{(pid=71205) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Guest created on hypervisor {{(pid=71205) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71205) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] No VIF found with MAC fa:16:3e:72:c7:21, not building metadata {{(pid=71205) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 24 00:11:31 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Instance spawned successfully. Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:11:31 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Started> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:11:31 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] VM Started (Lifecycle Event) Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Found default for hw_cdrom_bus of ide {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Found default for hw_disk_bus of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Found default for hw_input_bus of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Found default for hw_pointer_model of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Found default for hw_video_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Found default for hw_vif_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event 
/opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:11:31 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:11:31 user nova-compute[71205]: INFO nova.compute.manager [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Took 13.14 seconds to spawn the instance on the hypervisor. Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.compute.manager [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:11:31 user nova-compute[71205]: INFO nova.compute.manager [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Took 13.81 seconds to build instance. Apr 24 00:11:31 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-b74c7229-9d28-463a-956f-273cb6996b53 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "212a2ad6-77ab-4615-b6ca-f426a3e76ab5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.988s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.network.neutron [req-fb6f8e56-a841-4bd2-bb76-6a12a4205803 req-e1cfde75-4ca4-4f79-b22c-d072821f80d6 service nova] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Updated VIF entry in instance network info cache for port be0c060b-e1fe-496e-8827-a2699e8a4017. 
{{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG nova.network.neutron [req-fb6f8e56-a841-4bd2-bb76-6a12a4205803 req-e1cfde75-4ca4-4f79-b22c-d072821f80d6 service nova] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Updating instance_info_cache with network_info: [{"id": "be0c060b-e1fe-496e-8827-a2699e8a4017", "address": "fa:16:3e:d3:61:2a", "network": {"id": "431c37bf-e6d7-4aa7-9081-a8c7bb5d5f3c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-800590822-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "218aba2df07b4afaa999399d0981e6bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe0c060b-e1", "ovs_interfaceid": "be0c060b-e1fe-496e-8827-a2699e8a4017", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:11:31 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-fb6f8e56-a841-4bd2-bb76-6a12a4205803 req-e1cfde75-4ca4-4f79-b22c-d072821f80d6 service nova] Releasing lock "refresh_cache-ffbf17ce-e3cb-4099-bea3-6887fef4476d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:11:32 user nova-compute[71205]: DEBUG nova.compute.manager [req-c98a3cb3-6f32-410e-babf-bb962886585f req-11bfa404-cccc-40a2-85df-d2eee54b6cdc service nova] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Received event network-vif-plugged-be0c060b-e1fe-496e-8827-a2699e8a4017 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:11:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-c98a3cb3-6f32-410e-babf-bb962886585f req-11bfa404-cccc-40a2-85df-d2eee54b6cdc service nova] Acquiring lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-c98a3cb3-6f32-410e-babf-bb962886585f req-11bfa404-cccc-40a2-85df-d2eee54b6cdc service nova] Lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-c98a3cb3-6f32-410e-babf-bb962886585f req-11bfa404-cccc-40a2-85df-d2eee54b6cdc service nova] Lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:32 user nova-compute[71205]: DEBUG nova.compute.manager [req-c98a3cb3-6f32-410e-babf-bb962886585f req-11bfa404-cccc-40a2-85df-d2eee54b6cdc service nova] 
[instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] No waiting events found dispatching network-vif-plugged-be0c060b-e1fe-496e-8827-a2699e8a4017 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:11:32 user nova-compute[71205]: WARNING nova.compute.manager [req-c98a3cb3-6f32-410e-babf-bb962886585f req-11bfa404-cccc-40a2-85df-d2eee54b6cdc service nova] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Received unexpected event network-vif-plugged-be0c060b-e1fe-496e-8827-a2699e8a4017 for instance with vm_state building and task_state spawning. Apr 24 00:11:32 user nova-compute[71205]: DEBUG nova.network.neutron [req-31f698ef-364c-4709-b077-41f752094bd1 req-9d9688b2-7f9c-4137-9f8e-113595664605 service nova] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Updated VIF entry in instance network info cache for port d90cab57-5c57-410a-a61e-ab316454a676. {{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:11:32 user nova-compute[71205]: DEBUG nova.network.neutron [req-31f698ef-364c-4709-b077-41f752094bd1 req-9d9688b2-7f9c-4137-9f8e-113595664605 service nova] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Updating instance_info_cache with network_info: [{"id": "d90cab57-5c57-410a-a61e-ab316454a676", "address": "fa:16:3e:07:f9:7b", "network": {"id": "e09e353d-220c-429c-9767-a11a59bbf99a", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2107564613-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e2bf3154181247f8963be8cd31399851", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd90cab57-5c", "ovs_interfaceid": "d90cab57-5c57-410a-a61e-ab316454a676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:11:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-31f698ef-364c-4709-b077-41f752094bd1 req-9d9688b2-7f9c-4137-9f8e-113595664605 service nova] Releasing lock "refresh_cache-e762c863-43e1-4f26-ab6b-c8ea40f08887" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:11:32 user nova-compute[71205]: DEBUG nova.network.neutron [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Successfully created port: 7a3b1d96-2c84-4994-b698-c59fb56c44f8 {{(pid=71205) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 24 00:11:32 user nova-compute[71205]: DEBUG nova.network.neutron [req-c36bdc29-daa4-404e-8d29-6dbc25ef5aba req-2bc398fa-31a6-4806-b6fa-e42f1bcb0085 service nova] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Updated VIF entry in instance network info cache for port f6c98734-17ee-42d0-9372-fd76526b0b27. 
{{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:11:32 user nova-compute[71205]: DEBUG nova.network.neutron [req-c36bdc29-daa4-404e-8d29-6dbc25ef5aba req-2bc398fa-31a6-4806-b6fa-e42f1bcb0085 service nova] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Updating instance_info_cache with network_info: [{"id": "f6c98734-17ee-42d0-9372-fd76526b0b27", "address": "fa:16:3e:72:c7:21", "network": {"id": "2b9edd97-2af0-4b09-993c-95cc2b427fd8", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1649565193-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ce75f63fc0904eceb03e8319bddba4d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6c98734-17", "ovs_interfaceid": "f6c98734-17ee-42d0-9372-fd76526b0b27", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:11:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-c36bdc29-daa4-404e-8d29-6dbc25ef5aba req-2bc398fa-31a6-4806-b6fa-e42f1bcb0085 service nova] Releasing lock "refresh_cache-ce19423d-a6ee-4506-9cd1-ec4803abdd86" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:11:32 user nova-compute[71205]: DEBUG nova.compute.manager [req-53e803cc-a556-4097-a68c-e0e4fe70ab24 req-9173f84d-2465-4799-b979-760f022606de service nova] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Received event network-vif-plugged-81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:11:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-53e803cc-a556-4097-a68c-e0e4fe70ab24 req-9173f84d-2465-4799-b979-760f022606de service nova] Acquiring lock "212a2ad6-77ab-4615-b6ca-f426a3e76ab5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-53e803cc-a556-4097-a68c-e0e4fe70ab24 req-9173f84d-2465-4799-b979-760f022606de service nova] Lock "212a2ad6-77ab-4615-b6ca-f426a3e76ab5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-53e803cc-a556-4097-a68c-e0e4fe70ab24 req-9173f84d-2465-4799-b979-760f022606de service nova] Lock "212a2ad6-77ab-4615-b6ca-f426a3e76ab5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:32 user nova-compute[71205]: DEBUG nova.compute.manager [req-53e803cc-a556-4097-a68c-e0e4fe70ab24 req-9173f84d-2465-4799-b979-760f022606de service 
nova] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] No waiting events found dispatching network-vif-plugged-81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:11:32 user nova-compute[71205]: WARNING nova.compute.manager [req-53e803cc-a556-4097-a68c-e0e4fe70ab24 req-9173f84d-2465-4799-b979-760f022606de service nova] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Received unexpected event network-vif-plugged-81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54 for instance with vm_state active and task_state None. Apr 24 00:11:33 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:34 user nova-compute[71205]: DEBUG nova.compute.manager [req-ad41b5ab-3f7e-4c17-8d94-bf4171b8a86c req-0d248583-19dc-4755-abd3-be36caa502f5 service nova] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Received event network-vif-plugged-be0c060b-e1fe-496e-8827-a2699e8a4017 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:11:34 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-ad41b5ab-3f7e-4c17-8d94-bf4171b8a86c req-0d248583-19dc-4755-abd3-be36caa502f5 service nova] Acquiring lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:34 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-ad41b5ab-3f7e-4c17-8d94-bf4171b8a86c req-0d248583-19dc-4755-abd3-be36caa502f5 service nova] Lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:34 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-ad41b5ab-3f7e-4c17-8d94-bf4171b8a86c req-0d248583-19dc-4755-abd3-be36caa502f5 service nova] Lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:34 user nova-compute[71205]: DEBUG nova.compute.manager [req-ad41b5ab-3f7e-4c17-8d94-bf4171b8a86c req-0d248583-19dc-4755-abd3-be36caa502f5 service nova] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] No waiting events found dispatching network-vif-plugged-be0c060b-e1fe-496e-8827-a2699e8a4017 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:11:34 user nova-compute[71205]: WARNING nova.compute.manager [req-ad41b5ab-3f7e-4c17-8d94-bf4171b8a86c req-0d248583-19dc-4755-abd3-be36caa502f5 service nova] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Received unexpected event network-vif-plugged-be0c060b-e1fe-496e-8827-a2699e8a4017 for instance with vm_state building and task_state spawning. 
Apr 24 00:11:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:35 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:35 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:35 user nova-compute[71205]: DEBUG nova.network.neutron [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Successfully updated port: 7a3b1d96-2c84-4994-b698-c59fb56c44f8 {{(pid=71205) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 24 00:11:35 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquiring lock "refresh_cache-c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:11:35 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquired lock "refresh_cache-c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:11:35 user nova-compute[71205]: DEBUG nova.network.neutron [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Building network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 24 00:11:35 user nova-compute[71205]: DEBUG nova.compute.manager [req-da02cd48-8d5d-4179-8fe5-886bc329445e req-24a251e2-a64c-4f79-984d-2a24bbbeee22 service nova] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Received event network-changed-7a3b1d96-2c84-4994-b698-c59fb56c44f8 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:11:35 user nova-compute[71205]: DEBUG nova.compute.manager [req-da02cd48-8d5d-4179-8fe5-886bc329445e req-24a251e2-a64c-4f79-984d-2a24bbbeee22 service nova] [instance: 
c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Refreshing instance network info cache due to event network-changed-7a3b1d96-2c84-4994-b698-c59fb56c44f8. {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:11:35 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-da02cd48-8d5d-4179-8fe5-886bc329445e req-24a251e2-a64c-4f79-984d-2a24bbbeee22 service nova] Acquiring lock "refresh_cache-c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:11:36 user nova-compute[71205]: DEBUG nova.network.neutron [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Instance cache missing network info. {{(pid=71205) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 24 00:11:36 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:36 user nova-compute[71205]: DEBUG nova.compute.manager [req-fa79a7b0-9332-4ab4-bd5e-ec4c2224d4c4 req-538657c3-35b5-490d-b765-422cd2843ad0 service nova] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Received event network-vif-plugged-d90cab57-5c57-410a-a61e-ab316454a676 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:11:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-fa79a7b0-9332-4ab4-bd5e-ec4c2224d4c4 req-538657c3-35b5-490d-b765-422cd2843ad0 service nova] Acquiring lock "e762c863-43e1-4f26-ab6b-c8ea40f08887-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-fa79a7b0-9332-4ab4-bd5e-ec4c2224d4c4 req-538657c3-35b5-490d-b765-422cd2843ad0 service nova] Lock "e762c863-43e1-4f26-ab6b-c8ea40f08887-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-fa79a7b0-9332-4ab4-bd5e-ec4c2224d4c4 req-538657c3-35b5-490d-b765-422cd2843ad0 service nova] Lock "e762c863-43e1-4f26-ab6b-c8ea40f08887-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:36 user nova-compute[71205]: DEBUG nova.compute.manager [req-fa79a7b0-9332-4ab4-bd5e-ec4c2224d4c4 req-538657c3-35b5-490d-b765-422cd2843ad0 service nova] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] No waiting events found dispatching network-vif-plugged-d90cab57-5c57-410a-a61e-ab316454a676 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:11:36 user nova-compute[71205]: WARNING nova.compute.manager [req-fa79a7b0-9332-4ab4-bd5e-ec4c2224d4c4 req-538657c3-35b5-490d-b765-422cd2843ad0 service nova] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Received unexpected event network-vif-plugged-d90cab57-5c57-410a-a61e-ab316454a676 for instance with vm_state building and task_state spawning. 
Apr 24 00:11:36 user nova-compute[71205]: DEBUG nova.compute.manager [req-fa79a7b0-9332-4ab4-bd5e-ec4c2224d4c4 req-538657c3-35b5-490d-b765-422cd2843ad0 service nova] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Received event network-vif-plugged-d90cab57-5c57-410a-a61e-ab316454a676 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:11:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-fa79a7b0-9332-4ab4-bd5e-ec4c2224d4c4 req-538657c3-35b5-490d-b765-422cd2843ad0 service nova] Acquiring lock "e762c863-43e1-4f26-ab6b-c8ea40f08887-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-fa79a7b0-9332-4ab4-bd5e-ec4c2224d4c4 req-538657c3-35b5-490d-b765-422cd2843ad0 service nova] Lock "e762c863-43e1-4f26-ab6b-c8ea40f08887-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-fa79a7b0-9332-4ab4-bd5e-ec4c2224d4c4 req-538657c3-35b5-490d-b765-422cd2843ad0 service nova] Lock "e762c863-43e1-4f26-ab6b-c8ea40f08887-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:36 user nova-compute[71205]: DEBUG nova.compute.manager [req-fa79a7b0-9332-4ab4-bd5e-ec4c2224d4c4 req-538657c3-35b5-490d-b765-422cd2843ad0 service nova] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] No waiting events found dispatching network-vif-plugged-d90cab57-5c57-410a-a61e-ab316454a676 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:11:36 user nova-compute[71205]: WARNING nova.compute.manager [req-fa79a7b0-9332-4ab4-bd5e-ec4c2224d4c4 req-538657c3-35b5-490d-b765-422cd2843ad0 service nova] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Received unexpected event network-vif-plugged-d90cab57-5c57-410a-a61e-ab316454a676 for instance with vm_state building and task_state spawning. 
Apr 24 00:11:36 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:36 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:36 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.network.neutron [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Updating instance_info_cache with network_info: [{"id": "7a3b1d96-2c84-4994-b698-c59fb56c44f8", "address": "fa:16:3e:06:2a:5d", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a3b1d96-2c", "ovs_interfaceid": "7a3b1d96-2c84-4994-b698-c59fb56c44f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Resumed> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:11:37 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] VM Resumed (Lifecycle Event) Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.compute.manager [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Instance event wait completed in 0 seconds for {{(pid=71205) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Guest created on hypervisor {{(pid=71205) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 
tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Releasing lock "refresh_cache-c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.compute.manager [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Instance network_info: |[{"id": "7a3b1d96-2c84-4994-b698-c59fb56c44f8", "address": "fa:16:3e:06:2a:5d", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a3b1d96-2c", "ovs_interfaceid": "7a3b1d96-2c84-4994-b698-c59fb56c44f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-da02cd48-8d5d-4179-8fe5-886bc329445e req-24a251e2-a64c-4f79-984d-2a24bbbeee22 service nova] Acquired lock "refresh_cache-c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.network.neutron [req-da02cd48-8d5d-4179-8fe5-886bc329445e req-24a251e2-a64c-4f79-984d-2a24bbbeee22 service nova] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Refreshing network info cache for port 7a3b1d96-2c84-4994-b698-c59fb56c44f8 {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Start _get_guest_xml network_info=[{"id": "7a3b1d96-2c84-4994-b698-c59fb56c44f8", "address": "fa:16:3e:06:2a:5d", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a3b1d96-2c", "ovs_interfaceid": "7a3b1d96-2c84-4994-b698-c59fb56c44f8", "qbh_params": 
null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': 'fcf09ead-c5af-40cc-b5cf-92626e181ef9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 24 00:11:37 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Instance spawned successfully. Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:11:37 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:11:37 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71205) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-24T00:08:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=), allow threads: True {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Flavor limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Image limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Flavor pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Image pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 
tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Got 1 possible topologies {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:11:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1137356929',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1137356929',id=6,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d26ba1ed4b9241f9a084db1a14a945bb',ramdisk_id='',reservation_id='r-jniph3u4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-2021792443',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:11:28Z,user_data=None,user_id='8d0ab07106dd4995aa7e3f5b6bc70e56',uuid=c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7a3b1d96-2c84-4994-b698-c59fb56c44f8", "address": "fa:16:3e:06:2a:5d", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a3b1d96-2c", "ovs_interfaceid": "7a3b1d96-2c84-4994-b698-c59fb56c44f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71205) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Converting VIF {"id": "7a3b1d96-2c84-4994-b698-c59fb56c44f8", "address": 
"fa:16:3e:06:2a:5d", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a3b1d96-2c", "ovs_interfaceid": "7a3b1d96-2c84-4994-b698-c59fb56c44f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:2a:5d,bridge_name='br-int',has_traffic_filtering=True,id=7a3b1d96-2c84-4994-b698-c59fb56c44f8,network=Network(52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a3b1d96-2c') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.objects.instance [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lazy-loading 'pci_devices' on Instance uuid c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:11:37 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Started> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:11:37 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] VM Started (Lifecycle Event) Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Found default for hw_cdrom_bus of ide {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Found default for hw_disk_bus of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Found default for hw_input_bus of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Found default for hw_pointer_model of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Found default for hw_video_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Found default for hw_vif_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] End _get_guest_xml xml= [libvirt domain XML omitted: the capture stripped the XML markup and left only scattered element values; recoverable from the residue: uuid c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b, libvirt name instance-00000006, memory 131072 KiB, 1 vCPU, nova name tempest-ServerBootFromVolumeStableRescueTest-server-1137356929, creation time 2023-04-24 00:11:37, owner project/user tempest-ServerBootFromVolumeStableRescueTest-2021792443 / tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0, os type hvm, CPU model Nehalem, rng backend /dev/urandom] Apr 24 00:11:37 user
nova-compute[71205]: Apr 24 00:11:37 user nova-compute[71205]: {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:11:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1137356929',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1137356929',id=6,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d26ba1ed4b9241f9a084db1a14a945bb',ramdisk_id='',reservation_id='r-jniph3u4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-2021792443',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:11:28Z,user_data=None,user_id='8d0ab07106dd4995aa7e3f5b6bc70e56',uuid=c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7a3b1d96-2c84-4994-b698-c59fb56c44f8", "address": "fa:16:3e:06:2a:5d", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a3b1d96-2c", "ovs_interfaceid": "7a3b1d96-2c84-4994-b698-c59fb56c44f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": 
{}} {{(pid=71205) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Converting VIF {"id": "7a3b1d96-2c84-4994-b698-c59fb56c44f8", "address": "fa:16:3e:06:2a:5d", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a3b1d96-2c", "ovs_interfaceid": "7a3b1d96-2c84-4994-b698-c59fb56c44f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:2a:5d,bridge_name='br-int',has_traffic_filtering=True,id=7a3b1d96-2c84-4994-b698-c59fb56c44f8,network=Network(52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a3b1d96-2c') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG os_vif [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:2a:5d,bridge_name='br-int',has_traffic_filtering=True,id=7a3b1d96-2c84-4994-b698-c59fb56c44f8,network=Network(52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a3b1d96-2c') {{(pid=71205) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.compute.manager [None req-3b98375f-a1ac-48a9-9068-b667063f2734 
tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Instance event wait completed in 0 seconds for {{(pid=71205) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Guest created on hypervisor {{(pid=71205) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7a3b1d96-2c, may_exist=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7a3b1d96-2c, col_values=(('external_ids', {'iface-id': '7a3b1d96-2c84-4994-b698-c59fb56c44f8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:06:2a:5d', 'vm-uuid': 'c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b'}),)) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:11:37 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Instance spawned successfully. 
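The two ovsdbapp transactions just above (AddPortCommand on br-int followed by DbSetCommand on the Interface row) are what wire the tap device tap7a3b1d96-2c into the integration bridge and stamp it with the Neutron port ID so OVN can bind it. A minimal standalone sketch of the same calls through ovsdbapp's Open_vSwitch schema API is below; the OVSDB endpoint and timeout are assumptions rather than values taken from this log, and the exact connection helper may differ between ovsdbapp releases.

    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    # Assumed OVSDB endpoint; a deployment may use a unix socket or tcp instead.
    OVSDB_ENDPOINT = 'tcp:127.0.0.1:6640'

    idl = connection.OvsdbIdl.from_server(OVSDB_ENDPOINT, 'Open_vSwitch')
    api = impl_idl.OvsdbIdl(connection.Connection(idl=idl, timeout=10))

    # Same sequence the log shows os-vif issuing: ensure br-int exists, add the
    # tap port, then tag the Interface row so Neutron/OVN can bind port 7a3b1d96-....
    with api.transaction(check_error=True) as txn:
        txn.add(api.add_br('br-int', may_exist=True, datapath_type='system'))
        txn.add(api.add_port('br-int', 'tap7a3b1d96-2c', may_exist=True))
        txn.add(api.db_set(
            'Interface', 'tap7a3b1d96-2c',
            ('external_ids', {
                'iface-id': '7a3b1d96-2c84-4994-b698-c59fb56c44f8',
                'iface-status': 'active',
                'attached-mac': 'fa:16:3e:06:2a:5d',
                'vm-uuid': 'c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b',
            })))

The earlier "Transaction caused no change" line simply means br-int already existed with those attributes, so the AddBridgeCommand was a no-op.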
Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:37 user nova-compute[71205]: INFO os_vif [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:2a:5d,bridge_name='br-int',has_traffic_filtering=True,id=7a3b1d96-2c84-4994-b698-c59fb56c44f8,network=Network(52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a3b1d96-2c') Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Found default for hw_cdrom_bus of ide {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Found default for hw_disk_bus of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Found default for hw_input_bus of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Found default for hw_pointer_model of 
None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Found default for hw_video_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Found default for hw_vif_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:37 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Resumed> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:11:37 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] VM Resumed (Lifecycle Event) Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71205) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] No VIF found with MAC fa:16:3e:06:2a:5d, not building metadata {{(pid=71205) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.compute.manager [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Instance event wait completed in 0 seconds for {{(pid=71205) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Guest created on hypervisor {{(pid=71205) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:11:37 user nova-compute[71205]: INFO nova.compute.manager [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Took 16.27 seconds to spawn the instance on the hypervisor. Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.compute.manager [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:11:37 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Instance spawned successfully. Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 24 00:11:37 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] During sync_power_state the instance has a pending task (spawning). Skip. 
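The recurring "Synchronizing instance power state ... current DB power_state: 0, VM power_state: 1" entries compare the state recorded in the Nova database with what libvirt currently reports; while task_state is still spawning the mismatch is expected, hence the "pending task (spawning). Skip." follow-ups. The numeric codes are Nova's power-state constants; the small decoder below reflects my understanding of nova.compute.power_state and is not something this log emits.

    # Power-state codes as used in the sync_power_state lines above
    # (assumed to match nova.compute.power_state).
    POWER_STATES = {
        0: 'NOSTATE',    # nothing recorded yet / pending
        1: 'RUNNING',
        3: 'PAUSED',
        4: 'SHUTDOWN',
        6: 'CRASHED',
        7: 'SUSPENDED',
    }

    def describe_sync(db_state: int, vm_state: int) -> str:
        """Render the comparison the lifecycle-event handler logs."""
        return (f"DB={POWER_STATES.get(db_state, db_state)} "
                f"vs VM={POWER_STATES.get(vm_state, vm_state)}")

    print(describe_sync(0, 1))  # DB=NOSTATE vs VM=RUNNING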
Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Started> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:11:37 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] VM Started (Lifecycle Event) Apr 24 00:11:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Found default for hw_cdrom_bus of ide {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Found default for hw_disk_bus of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Found default for hw_input_bus of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Found default for hw_pointer_model of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Found default for hw_video_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Found default for hw_vif_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:11:37 user nova-compute[71205]: INFO nova.compute.manager [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 
tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Took 12.12 seconds to spawn the instance on the hypervisor. Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.compute.manager [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:11:37 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Resumed> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:11:37 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] VM Resumed (Lifecycle Event) Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:11:37 user nova-compute[71205]: INFO nova.compute.manager [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Took 14.74 seconds to spawn the instance on the hypervisor. Apr 24 00:11:37 user nova-compute[71205]: DEBUG nova.compute.manager [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:11:38 user nova-compute[71205]: INFO nova.compute.manager [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Took 13.00 seconds to build instance. 
Apr 24 00:11:38 user nova-compute[71205]: INFO nova.compute.manager [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Took 17.52 seconds to build instance. Apr 24 00:11:38 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3b98375f-a1ac-48a9-9068-b667063f2734 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "e762c863-43e1-4f26-ab6b-c8ea40f08887" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.140s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:38 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:11:38 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Started> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:11:38 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] VM Started (Lifecycle Event) Apr 24 00:11:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:38 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:11:38 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-8dca2f8d-75fa-44e1-b2b1-bce91aed6ae4 tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 17.810s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:38 user nova-compute[71205]: INFO nova.compute.manager [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Took 15.63 seconds to build instance. 
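Several builds overlap in this window (ffbf17ce-..., e762c863-..., ce19423d-..., c4ed916c-...), so the per-instance timings reported as "Took N seconds to spawn the instance on the hypervisor" and "Took N seconds to build instance" are easiest to collect with a small filter. The sketch below relies only on the line format visible here; the input filename is a placeholder.

    import re
    from collections import defaultdict

    # Matches the spawn/build timing messages shown in the surrounding entries.
    TIMING = re.compile(
        r'\[instance: (?P<uuid>[0-9a-f-]{36})\] '
        r'Took (?P<secs>[\d.]+) seconds to '
        r'(?P<what>spawn the instance on the hypervisor|build instance)')

    def collect_timings(path='nova-compute.log'):  # placeholder filename
        timings = defaultdict(dict)
        with open(path) as fh:
            for line in fh:
                m = TIMING.search(line)
                if m:
                    timings[m.group('uuid')][m.group('what')] = float(m.group('secs'))
        return dict(timings)

    # e.g. e762c863-... -> {'spawn the instance on the hypervisor': 12.12,
    #                       'build instance': 13.0}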
Apr 24 00:11:38 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:11:38 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-398c0f12-a93a-47b1-9e73-b31b857c5a44 tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "ce19423d-a6ee-4506-9cd1-ec4803abdd86" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.826s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:38 user nova-compute[71205]: DEBUG nova.compute.manager [req-bdd48bd3-a77d-442e-a407-78312631bbfd req-5e8e1b2e-a19d-48d2-9ccc-303991589f09 service nova] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Received event network-vif-plugged-f6c98734-17ee-42d0-9372-fd76526b0b27 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:11:38 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bdd48bd3-a77d-442e-a407-78312631bbfd req-5e8e1b2e-a19d-48d2-9ccc-303991589f09 service nova] Acquiring lock "ce19423d-a6ee-4506-9cd1-ec4803abdd86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:38 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bdd48bd3-a77d-442e-a407-78312631bbfd req-5e8e1b2e-a19d-48d2-9ccc-303991589f09 service nova] Lock "ce19423d-a6ee-4506-9cd1-ec4803abdd86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:38 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bdd48bd3-a77d-442e-a407-78312631bbfd req-5e8e1b2e-a19d-48d2-9ccc-303991589f09 service nova] Lock "ce19423d-a6ee-4506-9cd1-ec4803abdd86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:38 user nova-compute[71205]: DEBUG nova.compute.manager [req-bdd48bd3-a77d-442e-a407-78312631bbfd req-5e8e1b2e-a19d-48d2-9ccc-303991589f09 service nova] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] No waiting events found dispatching network-vif-plugged-f6c98734-17ee-42d0-9372-fd76526b0b27 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:11:38 user nova-compute[71205]: WARNING nova.compute.manager [req-bdd48bd3-a77d-442e-a407-78312631bbfd req-5e8e1b2e-a19d-48d2-9ccc-303991589f09 service nova] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Received unexpected event 
network-vif-plugged-f6c98734-17ee-42d0-9372-fd76526b0b27 for instance with vm_state active and task_state None. Apr 24 00:11:38 user nova-compute[71205]: DEBUG nova.compute.manager [req-bdd48bd3-a77d-442e-a407-78312631bbfd req-5e8e1b2e-a19d-48d2-9ccc-303991589f09 service nova] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Received event network-vif-plugged-f6c98734-17ee-42d0-9372-fd76526b0b27 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:11:38 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bdd48bd3-a77d-442e-a407-78312631bbfd req-5e8e1b2e-a19d-48d2-9ccc-303991589f09 service nova] Acquiring lock "ce19423d-a6ee-4506-9cd1-ec4803abdd86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:38 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bdd48bd3-a77d-442e-a407-78312631bbfd req-5e8e1b2e-a19d-48d2-9ccc-303991589f09 service nova] Lock "ce19423d-a6ee-4506-9cd1-ec4803abdd86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:38 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bdd48bd3-a77d-442e-a407-78312631bbfd req-5e8e1b2e-a19d-48d2-9ccc-303991589f09 service nova] Lock "ce19423d-a6ee-4506-9cd1-ec4803abdd86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:38 user nova-compute[71205]: DEBUG nova.compute.manager [req-bdd48bd3-a77d-442e-a407-78312631bbfd req-5e8e1b2e-a19d-48d2-9ccc-303991589f09 service nova] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] No waiting events found dispatching network-vif-plugged-f6c98734-17ee-42d0-9372-fd76526b0b27 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:11:38 user nova-compute[71205]: WARNING nova.compute.manager [req-bdd48bd3-a77d-442e-a407-78312631bbfd req-5e8e1b2e-a19d-48d2-9ccc-303991589f09 service nova] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Received unexpected event network-vif-plugged-f6c98734-17ee-42d0-9372-fd76526b0b27 for instance with vm_state active and task_state None. 
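The network-vif-plugged-f6c98734-... notifications above are delivered by Neutron through Nova's os-server-external-events API; they are logged as "unexpected" only because ce19423d-... had already finished building (vm_state active, task_state None), so no waiter was registered for them. A hedged sketch of the call Neutron effectively makes is below; the endpoint URL and token are placeholders, since real deployments resolve the compute endpoint from the Keystone catalog.

    import requests

    NOVA_URL = 'http://controller:8774/v2.1'   # placeholder endpoint
    TOKEN = '<keystone-token>'                 # placeholder auth token

    payload = {
        'events': [{
            'name': 'network-vif-plugged',
            'server_uuid': 'ce19423d-a6ee-4506-9cd1-ec4803abdd86',
            'tag': 'f6c98734-17ee-42d0-9372-fd76526b0b27',  # the Neutron port ID
            'status': 'completed',
        }]
    }

    resp = requests.post(f'{NOVA_URL}/os-server-external-events',
                         json=payload,
                         headers={'X-Auth-Token': TOKEN})
    resp.raise_for_status()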
Apr 24 00:11:39 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:39 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:39 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:39 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:39 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:39 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:39 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:39 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:39 user nova-compute[71205]: DEBUG nova.network.neutron [req-da02cd48-8d5d-4179-8fe5-886bc329445e req-24a251e2-a64c-4f79-984d-2a24bbbeee22 service nova] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Updated VIF entry in instance network info cache for port 7a3b1d96-2c84-4994-b698-c59fb56c44f8. 
{{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:11:39 user nova-compute[71205]: DEBUG nova.network.neutron [req-da02cd48-8d5d-4179-8fe5-886bc329445e req-24a251e2-a64c-4f79-984d-2a24bbbeee22 service nova] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Updating instance_info_cache with network_info: [{"id": "7a3b1d96-2c84-4994-b698-c59fb56c44f8", "address": "fa:16:3e:06:2a:5d", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a3b1d96-2c", "ovs_interfaceid": "7a3b1d96-2c84-4994-b698-c59fb56c44f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:11:39 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-da02cd48-8d5d-4179-8fe5-886bc329445e req-24a251e2-a64c-4f79-984d-2a24bbbeee22 service nova] Releasing lock "refresh_cache-c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:11:40 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:40 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:40 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:40 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:41 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:41 user nova-compute[71205]: DEBUG nova.compute.manager [req-af7534d2-b327-40f1-80e2-e2e1a32757aa req-0cf7283e-d93e-41c1-9ccf-fbc78d4d4cb2 service nova] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Received event network-vif-plugged-7a3b1d96-2c84-4994-b698-c59fb56c44f8 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:11:41 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-af7534d2-b327-40f1-80e2-e2e1a32757aa req-0cf7283e-d93e-41c1-9ccf-fbc78d4d4cb2 service nova] Acquiring lock "c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:41 user nova-compute[71205]: DEBUG 
oslo_concurrency.lockutils [req-af7534d2-b327-40f1-80e2-e2e1a32757aa req-0cf7283e-d93e-41c1-9ccf-fbc78d4d4cb2 service nova] Lock "c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:41 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-af7534d2-b327-40f1-80e2-e2e1a32757aa req-0cf7283e-d93e-41c1-9ccf-fbc78d4d4cb2 service nova] Lock "c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:41 user nova-compute[71205]: DEBUG nova.compute.manager [req-af7534d2-b327-40f1-80e2-e2e1a32757aa req-0cf7283e-d93e-41c1-9ccf-fbc78d4d4cb2 service nova] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] No waiting events found dispatching network-vif-plugged-7a3b1d96-2c84-4994-b698-c59fb56c44f8 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:11:41 user nova-compute[71205]: WARNING nova.compute.manager [req-af7534d2-b327-40f1-80e2-e2e1a32757aa req-0cf7283e-d93e-41c1-9ccf-fbc78d4d4cb2 service nova] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Received unexpected event network-vif-plugged-7a3b1d96-2c84-4994-b698-c59fb56c44f8 for instance with vm_state building and task_state spawning. Apr 24 00:11:42 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:42 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Resumed> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:11:42 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] VM Resumed (Lifecycle Event) Apr 24 00:11:42 user nova-compute[71205]: DEBUG nova.compute.manager [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Instance event wait completed in 0 seconds for {{(pid=71205) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 24 00:11:42 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Guest created on hypervisor {{(pid=71205) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 24 00:11:42 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Instance spawned successfully. 
Apr 24 00:11:42 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 24 00:11:42 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:11:42 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:11:42 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:11:42 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Started> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:11:42 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] VM Started (Lifecycle Event) Apr 24 00:11:42 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Found default for hw_cdrom_bus of ide {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:42 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Found default for hw_disk_bus of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:42 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Found default for hw_input_bus of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:42 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Found 
default for hw_pointer_model of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:42 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Found default for hw_video_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:42 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Found default for hw_vif_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:42 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:11:42 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:11:42 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:11:42 user nova-compute[71205]: INFO nova.compute.manager [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Took 14.66 seconds to spawn the instance on the hypervisor. Apr 24 00:11:42 user nova-compute[71205]: DEBUG nova.compute.manager [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:11:43 user nova-compute[71205]: INFO nova.compute.manager [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Took 15.54 seconds to build instance. 
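The "Found default for hw_cdrom_bus of ide", "hw_disk_bus of virtio" and similar entries record the bus and device models the libvirt driver falls back to when the image defines no hw_* properties. If those choices matter, they can be pinned on the Glance image instead; the openstacksdk sketch below is only illustrative, the cloud and image names are placeholders, and it assumes openstacksdk's usual behaviour of storing unrecognised image attributes as image properties.

    import openstack

    conn = openstack.connect(cloud='devstack')                  # placeholder clouds.yaml entry
    image = conn.image.find_image('cirros-0.5.2-x86_64-disk')   # placeholder image name

    # Pin the values the driver otherwise defaults to (see the entries above).
    conn.image.update_image(
        image,
        hw_cdrom_bus='ide',
        hw_disk_bus='virtio',
        hw_video_model='virtio',
        hw_vif_model='virtio',
    )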
Apr 24 00:11:43 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-27c2118c-033b-4842-b090-af2c77b3c2fb tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.678s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:43 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:44 user nova-compute[71205]: DEBUG nova.compute.manager [req-f6a6fb5b-5721-468b-a22f-49071f6852d6 req-347e9524-6366-40fb-b1c8-151d0d2f97be service nova] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Received event network-vif-plugged-7a3b1d96-2c84-4994-b698-c59fb56c44f8 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:11:44 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-f6a6fb5b-5721-468b-a22f-49071f6852d6 req-347e9524-6366-40fb-b1c8-151d0d2f97be service nova] Acquiring lock "c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:44 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-f6a6fb5b-5721-468b-a22f-49071f6852d6 req-347e9524-6366-40fb-b1c8-151d0d2f97be service nova] Lock "c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:44 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-f6a6fb5b-5721-468b-a22f-49071f6852d6 req-347e9524-6366-40fb-b1c8-151d0d2f97be service nova] Lock "c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:44 user nova-compute[71205]: DEBUG nova.compute.manager [req-f6a6fb5b-5721-468b-a22f-49071f6852d6 req-347e9524-6366-40fb-b1c8-151d0d2f97be service nova] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] No waiting events found dispatching network-vif-plugged-7a3b1d96-2c84-4994-b698-c59fb56c44f8 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:11:44 user nova-compute[71205]: WARNING nova.compute.manager [req-f6a6fb5b-5721-468b-a22f-49071f6852d6 req-347e9524-6366-40fb-b1c8-151d0d2f97be service nova] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Received unexpected event network-vif-plugged-7a3b1d96-2c84-4994-b698-c59fb56c44f8 for instance with vm_state active and task_state None. 
Apr 24 00:11:45 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Acquiring lock "dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:47 user nova-compute[71205]: DEBUG nova.compute.manager [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Starting instance... {{(pid=71205) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 24 00:11:47 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:47 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71205) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 24 00:11:47 user nova-compute[71205]: INFO nova.compute.claims [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Claim successful on node user Apr 24 00:11:47 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:48 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:11:48 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:11:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.577s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:48 user nova-compute[71205]: DEBUG nova.compute.manager [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Start building networks asynchronously for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 24 00:11:48 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:48 user nova-compute[71205]: DEBUG nova.compute.manager [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Allocating IP information in the background. 
{{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 24 00:11:48 user nova-compute[71205]: DEBUG nova.network.neutron [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] allocate_for_instance() {{(pid=71205) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 24 00:11:48 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 24 00:11:48 user nova-compute[71205]: DEBUG nova.compute.manager [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Start building block device mappings for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 24 00:11:48 user nova-compute[71205]: DEBUG nova.policy [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '640ec20e46a2422a8aabcc152e522e02', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'df0187dbb10d42da941645107df203f6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71205) authorize /opt/stack/nova/nova/policy.py:203}} Apr 24 00:11:48 user nova-compute[71205]: DEBUG nova.compute.manager [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Start spawning the instance on the hypervisor. 
{{(pid=71205) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 24 00:11:48 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Creating instance directory {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 24 00:11:48 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Creating image(s) Apr 24 00:11:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Acquiring lock "/opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "/opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "/opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:48 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:48 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.193s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None 
req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Acquiring lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.003s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:48 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:48 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.164s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:48 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk 1073741824 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:49 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk 1073741824" returned: 0 in 0.075s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:49 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.249s 
{{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:49 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:49 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.169s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:49 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Checking if we can resize image /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk. size=1073741824 {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 24 00:11:49 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:49 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk --force-share --output=json" returned: 0 in 0.182s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:49 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Cannot resize image /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk to a smaller size. 
{{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 24 00:11:49 user nova-compute[71205]: DEBUG nova.objects.instance [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lazy-loading 'migration_context' on Instance uuid dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:11:49 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Created local disks {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 24 00:11:49 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Ensure instance console log exists: /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/console.log {{(pid=71205) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 24 00:11:49 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:49 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:49 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:50 user nova-compute[71205]: DEBUG nova.network.neutron [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Successfully created port: d6f98d8d-f918-4aa5-abb0-a34e782f890a {{(pid=71205) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 24 00:11:52 user nova-compute[71205]: DEBUG nova.network.neutron [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Successfully updated port: d6f98d8d-f918-4aa5-abb0-a34e782f890a {{(pid=71205) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 24 00:11:52 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed 
tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Acquiring lock "refresh_cache-dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:11:52 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Acquired lock "refresh_cache-dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:11:52 user nova-compute[71205]: DEBUG nova.network.neutron [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Building network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 24 00:11:52 user nova-compute[71205]: DEBUG nova.network.neutron [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Instance cache missing network info. {{(pid=71205) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 24 00:11:52 user nova-compute[71205]: DEBUG nova.compute.manager [req-5c21f88b-6e2a-4997-a08a-f90868b64c4f req-86e169e0-4de0-42fc-a0fc-0adef6859759 service nova] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Received event network-changed-d6f98d8d-f918-4aa5-abb0-a34e782f890a {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:11:52 user nova-compute[71205]: DEBUG nova.compute.manager [req-5c21f88b-6e2a-4997-a08a-f90868b64c4f req-86e169e0-4de0-42fc-a0fc-0adef6859759 service nova] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Refreshing instance network info cache due to event network-changed-d6f98d8d-f918-4aa5-abb0-a34e782f890a. 
{{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:11:52 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-5c21f88b-6e2a-4997-a08a-f90868b64c4f req-86e169e0-4de0-42fc-a0fc-0adef6859759 service nova] Acquiring lock "refresh_cache-dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:11:52 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:52 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:52 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Acquiring lock "5e7bfc8d-7d4a-42f7-9657-cc65e1364b87" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:52 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Lock "5e7bfc8d-7d4a-42f7-9657-cc65e1364b87" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:52 user nova-compute[71205]: DEBUG nova.compute.manager [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Starting instance... 
{{(pid=71205) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.network.neutron [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Updating instance_info_cache with network_info: [{"id": "d6f98d8d-f918-4aa5-abb0-a34e782f890a", "address": "fa:16:3e:b2:2c:52", "network": {"id": "fc4a840b-9c4f-4c52-8a4e-411d66ac14e1", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2047577326-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df0187dbb10d42da941645107df203f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6f98d8d-f9", "ovs_interfaceid": "d6f98d8d-f918-4aa5-abb0-a34e782f890a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71205) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 24 00:11:53 user nova-compute[71205]: INFO nova.compute.claims [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Claim successful on node user Apr 24 00:11:53 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Releasing lock "refresh_cache-dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.compute.manager [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Instance network_info: |[{"id": "d6f98d8d-f918-4aa5-abb0-a34e782f890a", "address": "fa:16:3e:b2:2c:52", "network": {"id": "fc4a840b-9c4f-4c52-8a4e-411d66ac14e1", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2047577326-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df0187dbb10d42da941645107df203f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6f98d8d-f9", "ovs_interfaceid": "d6f98d8d-f918-4aa5-abb0-a34e782f890a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-5c21f88b-6e2a-4997-a08a-f90868b64c4f req-86e169e0-4de0-42fc-a0fc-0adef6859759 service nova] Acquired lock "refresh_cache-dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.network.neutron [req-5c21f88b-6e2a-4997-a08a-f90868b64c4f req-86e169e0-4de0-42fc-a0fc-0adef6859759 service nova] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Refreshing network info cache for port d6f98d8d-f918-4aa5-abb0-a34e782f890a {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Start _get_guest_xml network_info=[{"id": "d6f98d8d-f918-4aa5-abb0-a34e782f890a", "address": "fa:16:3e:b2:2c:52", "network": {"id": "fc4a840b-9c4f-4c52-8a4e-411d66ac14e1", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2047577326-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df0187dbb10d42da941645107df203f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6f98d8d-f9", "ovs_interfaceid": "d6f98d8d-f918-4aa5-abb0-a34e782f890a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': 'fcf09ead-c5af-40cc-b5cf-92626e181ef9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 24 00:11:53 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:11:53 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71205) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-24T00:08:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=), allow threads: True {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Flavor limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Image limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Flavor pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Image pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71205) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Got 1 possible topologies {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:11:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-624022111',display_name='tempest-VolumesAdminNegativeTest-server-624022111',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-624022111',id=7,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLS1AjVipKW5cEt2rl35m09NWv1oXpefnHcFJOdwsnsIp5JPDa2HbS1qzIbePN1In3l/JpDLeRBWNBZrjWWOmQw0RrMjdQbQyn62Y/HmMBLXcgD+X7ygv99DAQ58UKkQKw==',key_name='tempest-keypair-324105185',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='df0187dbb10d42da941645107df203f6',ramdisk_id='',reservation_id='r-tapv9pec',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-821594302',owner_user_name='tempest-VolumesAdminNegativeTest-821594302-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:11:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='640ec20e46a2422a8aabcc152e522e02',uuid=dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d6f98d8d-f918-4aa5-abb0-a34e782f890a", "address": "fa:16:3e:b2:2c:52", "network": {"id": "fc4a840b-9c4f-4c52-8a4e-411d66ac14e1", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2047577326-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df0187dbb10d42da941645107df203f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6f98d8d-f9", "ovs_interfaceid": "d6f98d8d-f918-4aa5-abb0-a34e782f890a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71205) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Converting VIF {"id": "d6f98d8d-f918-4aa5-abb0-a34e782f890a", "address": "fa:16:3e:b2:2c:52", "network": {"id": "fc4a840b-9c4f-4c52-8a4e-411d66ac14e1", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2047577326-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "df0187dbb10d42da941645107df203f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6f98d8d-f9", "ovs_interfaceid": "d6f98d8d-f918-4aa5-abb0-a34e782f890a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:2c:52,bridge_name='br-int',has_traffic_filtering=True,id=d6f98d8d-f918-4aa5-abb0-a34e782f890a,network=Network(fc4a840b-9c4f-4c52-8a4e-411d66ac14e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd6f98d8d-f9') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.objects.instance [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lazy-loading 'pci_devices' on Instance uuid dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] End _get_guest_xml xml= Apr 24 00:11:53 user nova-compute[71205]: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d Apr 24 00:11:53 user nova-compute[71205]: instance-00000007 Apr 24 00:11:53 user nova-compute[71205]: 131072 Apr 24 00:11:53 user nova-compute[71205]: 1 Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: tempest-VolumesAdminNegativeTest-server-624022111 Apr 24 00:11:53 user nova-compute[71205]: 2023-04-24 00:11:53 Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: 128 Apr 24 00:11:53 user nova-compute[71205]: 1 Apr 24 00:11:53 user nova-compute[71205]: 0 Apr 24 00:11:53 user nova-compute[71205]: 0 Apr 24 00:11:53 user nova-compute[71205]: 1 Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: tempest-VolumesAdminNegativeTest-821594302-project-member Apr 24 00:11:53 user nova-compute[71205]: tempest-VolumesAdminNegativeTest-821594302 Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: OpenStack Foundation Apr 24 00:11:53 user nova-compute[71205]: OpenStack Nova Apr 24 00:11:53 user nova-compute[71205]: 0.0.0 Apr 24 00:11:53 user 
nova-compute[71205]: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d Apr 24 00:11:53 user nova-compute[71205]: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d Apr 24 00:11:53 user nova-compute[71205]: Virtual Machine Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: hvm Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Nehalem Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: /dev/urandom Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: Apr 24 00:11:53 user nova-compute[71205]: {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:11:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-624022111',display_name='tempest-VolumesAdminNegativeTest-server-624022111',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-624022111',id=7,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLS1AjVipKW5cEt2rl35m09NWv1oXpefnHcFJOdwsnsIp5JPDa2HbS1qzIbePN1In3l/JpDLeRBWNBZrjWWOmQw0RrMjdQbQyn62Y/HmMBLXcgD+X7ygv99DAQ58UKkQKw==',key_name='tempest-keypair-324105185',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='df0187dbb10d42da941645107df203f6',ramdisk_id='',reservation_id='r-tapv9pec',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-821594302',owner_user_name='tempest-VolumesAdminNegativeTest-821594302-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:11:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='640ec20e46a2422a8aabcc152e522e02',uuid=dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d6f98d8d-f918-4aa5-abb0-a34e782f890a", "address": "fa:16:3e:b2:2c:52", "network": {"id": "fc4a840b-9c4f-4c52-8a4e-411d66ac14e1", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2047577326-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df0187dbb10d42da941645107df203f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6f98d8d-f9", "ovs_interfaceid": "d6f98d8d-f918-4aa5-abb0-a34e782f890a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Converting VIF {"id": "d6f98d8d-f918-4aa5-abb0-a34e782f890a", "address": "fa:16:3e:b2:2c:52", "network": {"id": "fc4a840b-9c4f-4c52-8a4e-411d66ac14e1", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2047577326-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": 
false, "tenant_id": "df0187dbb10d42da941645107df203f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6f98d8d-f9", "ovs_interfaceid": "d6f98d8d-f918-4aa5-abb0-a34e782f890a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:2c:52,bridge_name='br-int',has_traffic_filtering=True,id=d6f98d8d-f918-4aa5-abb0-a34e782f890a,network=Network(fc4a840b-9c4f-4c52-8a4e-411d66ac14e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd6f98d8d-f9') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG os_vif [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:2c:52,bridge_name='br-int',has_traffic_filtering=True,id=d6f98d8d-f918-4aa5-abb0-a34e782f890a,network=Network(fc4a840b-9c4f-4c52-8a4e-411d66ac14e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd6f98d8d-f9') {{(pid=71205) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd6f98d8d-f9, may_exist=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd6f98d8d-f9, col_values=(('external_ids', {'iface-id': 'd6f98d8d-f918-4aa5-abb0-a34e782f890a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b2:2c:52', 'vm-uuid': 'dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d'}),)) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:53 user nova-compute[71205]: INFO os_vif [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:2c:52,bridge_name='br-int',has_traffic_filtering=True,id=d6f98d8d-f918-4aa5-abb0-a34e782f890a,network=Network(fc4a840b-9c4f-4c52-8a4e-411d66ac14e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd6f98d8d-f9') Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] No BDM found with device name vda, not building metadata. {{(pid=71205) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] No VIF found with MAC fa:16:3e:b2:2c:52, not building metadata {{(pid=71205) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.664s {{(pid=71205) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.compute.manager [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Start building networks asynchronously for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.network.neutron [req-5c21f88b-6e2a-4997-a08a-f90868b64c4f req-86e169e0-4de0-42fc-a0fc-0adef6859759 service nova] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Updated VIF entry in instance network info cache for port d6f98d8d-f918-4aa5-abb0-a34e782f890a. {{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.network.neutron [req-5c21f88b-6e2a-4997-a08a-f90868b64c4f req-86e169e0-4de0-42fc-a0fc-0adef6859759 service nova] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Updating instance_info_cache with network_info: [{"id": "d6f98d8d-f918-4aa5-abb0-a34e782f890a", "address": "fa:16:3e:b2:2c:52", "network": {"id": "fc4a840b-9c4f-4c52-8a4e-411d66ac14e1", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2047577326-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df0187dbb10d42da941645107df203f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6f98d8d-f9", "ovs_interfaceid": "d6f98d8d-f918-4aa5-abb0-a34e782f890a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-5c21f88b-6e2a-4997-a08a-f90868b64c4f req-86e169e0-4de0-42fc-a0fc-0adef6859759 service nova] Releasing lock "refresh_cache-dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.compute.manager [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Allocating IP information in the background. 
{{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.network.neutron [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] allocate_for_instance() {{(pid=71205) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 24 00:11:53 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Ignoring supplied device name: /dev/sda. Libvirt can't honour user-supplied dev names Apr 24 00:11:53 user nova-compute[71205]: DEBUG nova.compute.manager [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Start building block device mappings for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 24 00:11:54 user nova-compute[71205]: DEBUG nova.policy [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '04625ea29ba641fc8342441f61274d4f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6e163039cafd4b2880a41ded2e2f7d00', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71205) authorize /opt/stack/nova/nova/policy.py:203}} Apr 24 00:11:54 user nova-compute[71205]: DEBUG nova.compute.manager [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Start spawning the instance on the hypervisor. 
{{(pid=71205) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 24 00:11:54 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Creating instance directory {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 24 00:11:54 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Creating image(s) Apr 24 00:11:54 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Acquiring lock "/opt/stack/data/nova/instances/5e7bfc8d-7d4a-42f7-9657-cc65e1364b87/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:54 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Lock "/opt/stack/data/nova/instances/5e7bfc8d-7d4a-42f7-9657-cc65e1364b87/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:54 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Lock "/opt/stack/data/nova/instances/5e7bfc8d-7d4a-42f7-9657-cc65e1364b87/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:54 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Acquiring lock "bd9ce8fa9784f2fc4cac817a53d1a4003311b195" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:54 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Lock "bd9ce8fa9784f2fc4cac817a53d1a4003311b195" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:54 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C 
LANG=C qemu-img info /opt/stack/data/nova/instances/_base/bd9ce8fa9784f2fc4cac817a53d1a4003311b195.part --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:54 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/bd9ce8fa9784f2fc4cac817a53d1a4003311b195.part --force-share --output=json" returned: 0 in 0.168s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:54 user nova-compute[71205]: DEBUG nova.virt.images [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] 6d0fc2e0-41f4-457d-aa83-7dd6fd114687 was qcow2, converting to raw {{(pid=71205) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 24 00:11:54 user nova-compute[71205]: DEBUG nova.privsep.utils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71205) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 24 00:11:54 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/bd9ce8fa9784f2fc4cac817a53d1a4003311b195.part /opt/stack/data/nova/instances/_base/bd9ce8fa9784f2fc4cac817a53d1a4003311b195.converted {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:54 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/bd9ce8fa9784f2fc4cac817a53d1a4003311b195.part /opt/stack/data/nova/instances/_base/bd9ce8fa9784f2fc4cac817a53d1a4003311b195.converted" returned: 0 in 0.165s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:54 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env 
LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/bd9ce8fa9784f2fc4cac817a53d1a4003311b195.converted --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:54 user nova-compute[71205]: DEBUG nova.compute.manager [req-a190cd02-3bf5-4f48-9b4a-1b11727bd82f req-7c20dae5-65ee-4645-9e64-ab39746795f0 service nova] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Received event network-vif-plugged-d6f98d8d-f918-4aa5-abb0-a34e782f890a {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:11:54 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-a190cd02-3bf5-4f48-9b4a-1b11727bd82f req-7c20dae5-65ee-4645-9e64-ab39746795f0 service nova] Acquiring lock "dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:54 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-a190cd02-3bf5-4f48-9b4a-1b11727bd82f req-7c20dae5-65ee-4645-9e64-ab39746795f0 service nova] Lock "dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:54 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-a190cd02-3bf5-4f48-9b4a-1b11727bd82f req-7c20dae5-65ee-4645-9e64-ab39746795f0 service nova] Lock "dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.015s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:54 user nova-compute[71205]: DEBUG nova.compute.manager [req-a190cd02-3bf5-4f48-9b4a-1b11727bd82f req-7c20dae5-65ee-4645-9e64-ab39746795f0 service nova] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] No waiting events found dispatching network-vif-plugged-d6f98d8d-f918-4aa5-abb0-a34e782f890a {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:11:54 user nova-compute[71205]: WARNING nova.compute.manager [req-a190cd02-3bf5-4f48-9b4a-1b11727bd82f req-7c20dae5-65ee-4645-9e64-ab39746795f0 service nova] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Received unexpected event network-vif-plugged-d6f98d8d-f918-4aa5-abb0-a34e782f890a for instance with vm_state building and task_state spawning. 
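The image-cache activity recorded around this point follows a fixed sequence: qemu-img info is run on the downloaded .part file under oslo_concurrency.prlimit (address space capped at 1 GiB, CPU time at 30 s), the qcow2 is converted to raw into a .converted file, the result is re-inspected, and the per-instance disk is then created as a thin qcow2 overlay backed by the cached raw image. The sketch below replays that sequence with plain subprocess calls; it borrows the paths and arguments from the surrounding entries and is an illustration of the flow, not Nova's actual code path.

    import json
    import subprocess

    BASE = "/opt/stack/data/nova/instances/_base/bd9ce8fa9784f2fc4cac817a53d1a4003311b195"
    DISK = "/opt/stack/data/nova/instances/5e7bfc8d-7d4a-42f7-9657-cc65e1364b87/disk"

    def qemu_img_info(path):
        # qemu-img info, wrapped in oslo_concurrency.prlimit exactly as logged,
        # so a malformed image cannot exhaust memory (1 GiB cap) or CPU (30 s cap).
        out = subprocess.run(
            ["/usr/bin/python3.10", "-m", "oslo_concurrency.prlimit",
             "--as=1073741824", "--cpu=30", "--",
             "env", "LC_ALL=C", "LANG=C",
             "qemu-img", "info", path, "--force-share", "--output=json"],
            check=True, capture_output=True, text=True,
        ).stdout
        return json.loads(out)

    # The Glance download (.part) is qcow2, so it is flattened to raw for the cache.
    if qemu_img_info(BASE + ".part")["format"] == "qcow2":
        subprocess.run(
            ["qemu-img", "convert", "-t", "none", "-O", "raw", "-f", "qcow2",
             BASE + ".part", BASE + ".converted"],
            check=True,
        )
        qemu_img_info(BASE + ".converted")  # re-inspect the converted copy

    # The instance root disk is a qcow2 overlay backed by the cached raw image;
    # 1073741824 bytes is the flavor's 1 GiB root disk.
    subprocess.run(
        ["env", "LC_ALL=C", "LANG=C", "qemu-img", "create", "-f", "qcow2",
         "-o", "backing_file=" + BASE + ",backing_fmt=raw", DISK, "1073741824"],
        check=True,
    )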
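The VIF plugs logged at 00:11:53 (tapd6f98d8d-f9) and 00:11:56 (tap5a56c96a-00) go through the native ovsdbapp IDL, as the AddBridgeCommand, AddPortCommand and DbSetCommand transactions show; os-vif does not shell out to ovs-vsctl. A rough ovs-vsctl equivalent of that transaction, assembled from the logged values and shown purely as an illustration of what lands in the OVSDB Interface row, would be:

    import subprocess

    # Values copied from the DbSetCommand logged at 00:11:56 for tap5a56c96a-00.
    port = "tap5a56c96a-00"
    iface_id = "5a56c96a-0083-47e5-819c-1d802bbcd6ea"
    mac = "fa:16:3e:04:8c:34"
    vm_uuid = "5e7bfc8d-7d4a-42f7-9657-cc65e1364b87"

    # AddBridgeCommand(name=br-int, may_exist=True, datapath_type=system)
    subprocess.run(
        ["ovs-vsctl", "--may-exist", "add-br", "br-int",
         "--", "set", "Bridge", "br-int", "datapath_type=system"],
        check=True,
    )

    # AddPortCommand(bridge=br-int, port=tap..., may_exist=True)
    subprocess.run(["ovs-vsctl", "--may-exist", "add-port", "br-int", port], check=True)

    # DbSetCommand(table=Interface, record=tap..., col_values=(('external_ids', {...}),))
    subprocess.run(
        ["ovs-vsctl", "set", "Interface", port,
         "external_ids:iface-id=" + iface_id,
         "external_ids:iface-status=active",
         "external_ids:attached-mac=" + mac,
         "external_ids:vm-uuid=" + vm_uuid],
        check=True,
    )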
Apr 24 00:11:55 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/bd9ce8fa9784f2fc4cac817a53d1a4003311b195.converted --force-share --output=json" returned: 0 in 0.223s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:55 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Lock "bd9ce8fa9784f2fc4cac817a53d1a4003311b195" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.908s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:55 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/bd9ce8fa9784f2fc4cac817a53d1a4003311b195 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:55 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/bd9ce8fa9784f2fc4cac817a53d1a4003311b195 --force-share --output=json" returned: 0 in 0.145s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:55 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Acquiring lock "bd9ce8fa9784f2fc4cac817a53d1a4003311b195" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:55 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Lock "bd9ce8fa9784f2fc4cac817a53d1a4003311b195" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:55 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/bd9ce8fa9784f2fc4cac817a53d1a4003311b195 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:55 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:55 user nova-compute[71205]: DEBUG nova.network.neutron [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Successfully created port: 5a56c96a-0083-47e5-819c-1d802bbcd6ea {{(pid=71205) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 24 00:11:55 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/bd9ce8fa9784f2fc4cac817a53d1a4003311b195 --force-share --output=json" returned: 0 in 0.169s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:55 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/bd9ce8fa9784f2fc4cac817a53d1a4003311b195,backing_fmt=raw /opt/stack/data/nova/instances/5e7bfc8d-7d4a-42f7-9657-cc65e1364b87/disk 1073741824 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:55 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/bd9ce8fa9784f2fc4cac817a53d1a4003311b195,backing_fmt=raw /opt/stack/data/nova/instances/5e7bfc8d-7d4a-42f7-9657-cc65e1364b87/disk 1073741824" returned: 0 in 0.062s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:55 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Lock "bd9ce8fa9784f2fc4cac817a53d1a4003311b195" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.238s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:55 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/bd9ce8fa9784f2fc4cac817a53d1a4003311b195 --force-share --output=json {{(pid=71205) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:55 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:55 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/bd9ce8fa9784f2fc4cac817a53d1a4003311b195 --force-share --output=json" returned: 0 in 0.137s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:55 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Checking if we can resize image /opt/stack/data/nova/instances/5e7bfc8d-7d4a-42f7-9657-cc65e1364b87/disk. size=1073741824 {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 24 00:11:55 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e7bfc8d-7d4a-42f7-9657-cc65e1364b87/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:55 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e7bfc8d-7d4a-42f7-9657-cc65e1364b87/disk --force-share --output=json" returned: 0 in 0.155s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:55 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Cannot resize image /opt/stack/data/nova/instances/5e7bfc8d-7d4a-42f7-9657-cc65e1364b87/disk to a smaller size. 
{{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 24 00:11:55 user nova-compute[71205]: DEBUG nova.objects.instance [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Lazy-loading 'migration_context' on Instance uuid 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:11:55 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Created local disks {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 24 00:11:55 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Ensure instance console log exists: /opt/stack/data/nova/instances/5e7bfc8d-7d4a-42f7-9657-cc65e1364b87/console.log {{(pid=71205) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 24 00:11:55 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:55 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:55 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:55 user nova-compute[71205]: DEBUG nova.network.neutron [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Successfully updated port: 5a56c96a-0083-47e5-819c-1d802bbcd6ea {{(pid=71205) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Acquiring lock "refresh_cache-5e7bfc8d-7d4a-42f7-9657-cc65e1364b87" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 
tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Acquired lock "refresh_cache-5e7bfc8d-7d4a-42f7-9657-cc65e1364b87" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.network.neutron [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Building network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.compute.manager [req-f1c74533-bae0-4ba6-a55c-a09af34fbce4 req-3a7190ba-fb51-4f3e-8613-d5b46cb9d094 service nova] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Received event network-changed-5a56c96a-0083-47e5-819c-1d802bbcd6ea {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.compute.manager [req-f1c74533-bae0-4ba6-a55c-a09af34fbce4 req-3a7190ba-fb51-4f3e-8613-d5b46cb9d094 service nova] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Refreshing instance network info cache due to event network-changed-5a56c96a-0083-47e5-819c-1d802bbcd6ea. {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-f1c74533-bae0-4ba6-a55c-a09af34fbce4 req-3a7190ba-fb51-4f3e-8613-d5b46cb9d094 service nova] Acquiring lock "refresh_cache-5e7bfc8d-7d4a-42f7-9657-cc65e1364b87" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.network.neutron [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Instance cache missing network info. 
{{(pid=71205) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.network.neutron [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Updating instance_info_cache with network_info: [{"id": "5a56c96a-0083-47e5-819c-1d802bbcd6ea", "address": "fa:16:3e:04:8c:34", "network": {"id": "60f29626-4198-42e4-835f-2d0d9cfabf8d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-973896992-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6e163039cafd4b2880a41ded2e2f7d00", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a56c96a-00", "ovs_interfaceid": "5a56c96a-0083-47e5-819c-1d802bbcd6ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Releasing lock "refresh_cache-5e7bfc8d-7d4a-42f7-9657-cc65e1364b87" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.compute.manager [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Instance network_info: |[{"id": "5a56c96a-0083-47e5-819c-1d802bbcd6ea", "address": "fa:16:3e:04:8c:34", "network": {"id": "60f29626-4198-42e4-835f-2d0d9cfabf8d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-973896992-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6e163039cafd4b2880a41ded2e2f7d00", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a56c96a-00", "ovs_interfaceid": "5a56c96a-0083-47e5-819c-1d802bbcd6ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-f1c74533-bae0-4ba6-a55c-a09af34fbce4 req-3a7190ba-fb51-4f3e-8613-d5b46cb9d094 service nova] Acquired lock "refresh_cache-5e7bfc8d-7d4a-42f7-9657-cc65e1364b87" {{(pid=71205) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.network.neutron [req-f1c74533-bae0-4ba6-a55c-a09af34fbce4 req-3a7190ba-fb51-4f3e-8613-d5b46cb9d094 service nova] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Refreshing network info cache for port 5a56c96a-0083-47e5-819c-1d802bbcd6ea {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Start _get_guest_xml network_info=[{"id": "5a56c96a-0083-47e5-819c-1d802bbcd6ea", "address": "fa:16:3e:04:8c:34", "network": {"id": "60f29626-4198-42e4-835f-2d0d9cfabf8d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-973896992-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6e163039cafd4b2880a41ded2e2f7d00", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a56c96a-00", "ovs_interfaceid": "5a56c96a-0083-47e5-819c-1d802bbcd6ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'scsi', 'cdrom_bus': 'scsi', 'mapping': {'root': {'bus': 'scsi', 'dev': 'sda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'scsi', 'dev': 'sda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'scsi', 'dev': 'sdb', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:11:43Z,direct_url=,disk_format='qcow2',id=6d0fc2e0-41f4-457d-aa83-7dd6fd114687,min_disk=0,min_ram=0,name='',owner='26ca23f4ae694d6eb392b4149eac642e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:11:46Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/sda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'disk_bus': 'scsi', 'device_name': '/dev/sda', 'boot_index': 0, 'encryption_options': None, 'image_id': '6d0fc2e0-41f4-457d-aa83-7dd6fd114687'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 24 00:11:56 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:11:56 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71205) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-24T00:08:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:11:43Z,direct_url=,disk_format='qcow2',id=6d0fc2e0-41f4-457d-aa83-7dd6fd114687,min_disk=0,min_ram=0,name='',owner='26ca23f4ae694d6eb392b4149eac642e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:11:46Z,virtual_size=,visibility=), allow threads: True {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Flavor limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Image limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Flavor pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Image pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71205) 
_get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Got 1 possible topologies {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2023-04-24T00:11:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-1292260913',display_name='tempest-AttachSCSIVolumeTestJSON-server-1292260913',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-1292260913',id=8,image_ref='6d0fc2e0-41f4-457d-aa83-7dd6fd114687',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP/vhtnDrSUbhDcnzbj2hTMPnSPvEi9MH767Q1zBuBu7Q388dTCb4/K/XteUMBOM3VV8UZ23HJ9k+WdLhWp/wu0OUK0FLVdByp/HDgDXNkveo1WfqdtyNUzgJX99Y7XXPw==',key_name='tempest-keypair-1524659291',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e163039cafd4b2880a41ded2e2f7d00',ramdisk_id='',reservation_id='r-trgx5ogp',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6d0fc2e0-41f4-457d-aa83-7dd6fd114687',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_machine_type='pc',image_hw_scsi_model='virtio-scsi',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachSCSIVolumeTestJSON-1166211293',owner_user_name='tempest-AttachSCSIVolumeTestJSON-1166211293-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:11:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='04625ea29ba641fc8342441f61274d4f',uuid=5e7bfc8d-7d4a-42f7-9657-cc65e1364b87,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5a56c96a-0083-47e5-819c-1d802bbcd6ea", "address": "fa:16:3e:04:8c:34", "network": {"id": "60f29626-4198-42e4-835f-2d0d9cfabf8d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-973896992-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6e163039cafd4b2880a41ded2e2f7d00", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a56c96a-00", "ovs_interfaceid": "5a56c96a-0083-47e5-819c-1d802bbcd6ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71205) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Converting VIF {"id": "5a56c96a-0083-47e5-819c-1d802bbcd6ea", "address": "fa:16:3e:04:8c:34", "network": {"id": "60f29626-4198-42e4-835f-2d0d9cfabf8d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-973896992-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6e163039cafd4b2880a41ded2e2f7d00", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a56c96a-00", "ovs_interfaceid": "5a56c96a-0083-47e5-819c-1d802bbcd6ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:8c:34,bridge_name='br-int',has_traffic_filtering=True,id=5a56c96a-0083-47e5-819c-1d802bbcd6ea,network=Network(60f29626-4198-42e4-835f-2d0d9cfabf8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5a56c96a-00') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.objects.instance [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Lazy-loading 'pci_devices' on Instance uuid 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] End _get_guest_xml xml= Apr 24 00:11:56 user nova-compute[71205]: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87 Apr 24 00:11:56 user nova-compute[71205]: instance-00000008 Apr 24 00:11:56 user nova-compute[71205]: 131072 Apr 24 00:11:56 user nova-compute[71205]: 1 Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: tempest-AttachSCSIVolumeTestJSON-server-1292260913 Apr 24 00:11:56 user nova-compute[71205]: 2023-04-24 00:11:56 Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: 128 Apr 24 00:11:56 user nova-compute[71205]: 1 Apr 24 00:11:56 user nova-compute[71205]: 0 Apr 24 00:11:56 user nova-compute[71205]: 0 Apr 24 00:11:56 user nova-compute[71205]: 1 Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: tempest-AttachSCSIVolumeTestJSON-1166211293-project-member Apr 24 00:11:56 user nova-compute[71205]: tempest-AttachSCSIVolumeTestJSON-1166211293 Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: OpenStack Foundation Apr 24 00:11:56 user nova-compute[71205]: OpenStack Nova Apr 24 00:11:56 user nova-compute[71205]: 0.0.0 Apr 24 00:11:56 user nova-compute[71205]: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87 Apr 24 00:11:56 user 
nova-compute[71205]: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87 Apr 24 00:11:56 user nova-compute[71205]: Virtual Machine Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: hvm Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Nehalem Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]:
Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]:
Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: /dev/urandom Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: Apr 24 00:11:56 user nova-compute[71205]: {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2023-04-24T00:11:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-1292260913',display_name='tempest-AttachSCSIVolumeTestJSON-server-1292260913',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-1292260913',id=8,image_ref='6d0fc2e0-41f4-457d-aa83-7dd6fd114687',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP/vhtnDrSUbhDcnzbj2hTMPnSPvEi9MH767Q1zBuBu7Q388dTCb4/K/XteUMBOM3VV8UZ23HJ9k+WdLhWp/wu0OUK0FLVdByp/HDgDXNkveo1WfqdtyNUzgJX99Y7XXPw==',key_name='tempest-keypair-1524659291',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e163039cafd4b2880a41ded2e2f7d00',ramdisk_id='',reservation_id='r-trgx5ogp',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6d0fc2e0-41f4-457d-aa83-7dd6fd114687',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_machine_type='pc',image_hw_scsi_model='virtio-scsi',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachSCSIVolumeTestJSON-1166211293',owner_user_name='tempest-AttachSCSIVolumeTestJSON-1166211293-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:11:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='04625ea29ba641fc8342441f61274d4f',uuid=5e7bfc8d-7d4a-42f7-9657-cc65e1364b87,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5a56c96a-0083-47e5-819c-1d802bbcd6ea", "address": "fa:16:3e:04:8c:34", "network": {"id": "60f29626-4198-42e4-835f-2d0d9cfabf8d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-973896992-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6e163039cafd4b2880a41ded2e2f7d00", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a56c96a-00", "ovs_interfaceid": "5a56c96a-0083-47e5-819c-1d802bbcd6ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Converting VIF {"id": "5a56c96a-0083-47e5-819c-1d802bbcd6ea", "address": "fa:16:3e:04:8c:34", "network": {"id": "60f29626-4198-42e4-835f-2d0d9cfabf8d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-973896992-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6e163039cafd4b2880a41ded2e2f7d00", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a56c96a-00", "ovs_interfaceid": "5a56c96a-0083-47e5-819c-1d802bbcd6ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:8c:34,bridge_name='br-int',has_traffic_filtering=True,id=5a56c96a-0083-47e5-819c-1d802bbcd6ea,network=Network(60f29626-4198-42e4-835f-2d0d9cfabf8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5a56c96a-00') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG os_vif [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:8c:34,bridge_name='br-int',has_traffic_filtering=True,id=5a56c96a-0083-47e5-819c-1d802bbcd6ea,network=Network(60f29626-4198-42e4-835f-2d0d9cfabf8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5a56c96a-00') {{(pid=71205) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5a56c96a-00, may_exist=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5a56c96a-00, col_values=(('external_ids', {'iface-id': '5a56c96a-0083-47e5-819c-1d802bbcd6ea', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:04:8c:34', 'vm-uuid': '5e7bfc8d-7d4a-42f7-9657-cc65e1364b87'}),)) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:56 user nova-compute[71205]: INFO os_vif [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:8c:34,bridge_name='br-int',has_traffic_filtering=True,id=5a56c96a-0083-47e5-819c-1d802bbcd6ea,network=Network(60f29626-4198-42e4-835f-2d0d9cfabf8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5a56c96a-00') Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] No BDM found with device name sda, not building metadata. {{(pid=71205) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] No BDM found with device name sdb, not building metadata. {{(pid=71205) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] No VIF found with MAC fa:16:3e:04:8c:34, not building metadata {{(pid=71205) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 24 00:11:56 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Using config drive Apr 24 00:11:56 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Creating config drive at /opt/stack/data/nova/instances/5e7bfc8d-7d4a-42f7-9657-cc65e1364b87/disk.config Apr 24 00:11:56 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/5e7bfc8d-7d4a-42f7-9657-cc65e1364b87/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 0.0.0 -quiet -J -r -V config-2 /tmp/tmpnvj0dblk {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/5e7bfc8d-7d4a-42f7-9657-cc65e1364b87/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 0.0.0 -quiet -J -r -V 
config-2 /tmp/tmpnvj0dblk" returned: 0 in 0.053s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.network.neutron [req-f1c74533-bae0-4ba6-a55c-a09af34fbce4 req-3a7190ba-fb51-4f3e-8613-d5b46cb9d094 service nova] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Updated VIF entry in instance network info cache for port 5a56c96a-0083-47e5-819c-1d802bbcd6ea. {{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.network.neutron [req-f1c74533-bae0-4ba6-a55c-a09af34fbce4 req-3a7190ba-fb51-4f3e-8613-d5b46cb9d094 service nova] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Updating instance_info_cache with network_info: [{"id": "5a56c96a-0083-47e5-819c-1d802bbcd6ea", "address": "fa:16:3e:04:8c:34", "network": {"id": "60f29626-4198-42e4-835f-2d0d9cfabf8d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-973896992-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6e163039cafd4b2880a41ded2e2f7d00", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a56c96a-00", "ovs_interfaceid": "5a56c96a-0083-47e5-819c-1d802bbcd6ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-f1c74533-bae0-4ba6-a55c-a09af34fbce4 req-3a7190ba-fb51-4f3e-8613-d5b46cb9d094 service nova] Releasing lock "refresh_cache-5e7bfc8d-7d4a-42f7-9657-cc65e1364b87" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Resumed> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:11:56 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] VM Resumed (Lifecycle Event) Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.compute.manager [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Instance event wait completed in 0 seconds for {{(pid=71205) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Guest created on hypervisor {{(pid=71205) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 24 00:11:56 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] 
Instance spawned successfully. Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Found default for hw_cdrom_bus of ide {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:56 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Found default for hw_disk_bus of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:57 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Found default for hw_input_bus of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:57 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Found default for hw_pointer_model of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:57 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Found default for hw_video_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:57 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 
tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Found default for hw_vif_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:11:57 user nova-compute[71205]: DEBUG nova.compute.manager [req-4550e69c-213f-4487-8695-925fa6ca820f req-8b93397c-8d2f-401d-8333-38b121ef2917 service nova] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Received event network-vif-plugged-d6f98d8d-f918-4aa5-abb0-a34e782f890a {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:11:57 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-4550e69c-213f-4487-8695-925fa6ca820f req-8b93397c-8d2f-401d-8333-38b121ef2917 service nova] Acquiring lock "dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:57 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-4550e69c-213f-4487-8695-925fa6ca820f req-8b93397c-8d2f-401d-8333-38b121ef2917 service nova] Lock "dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:57 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-4550e69c-213f-4487-8695-925fa6ca820f req-8b93397c-8d2f-401d-8333-38b121ef2917 service nova] Lock "dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:57 user nova-compute[71205]: DEBUG nova.compute.manager [req-4550e69c-213f-4487-8695-925fa6ca820f req-8b93397c-8d2f-401d-8333-38b121ef2917 service nova] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] No waiting events found dispatching network-vif-plugged-d6f98d8d-f918-4aa5-abb0-a34e782f890a {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:11:57 user nova-compute[71205]: WARNING nova.compute.manager [req-4550e69c-213f-4487-8695-925fa6ca820f req-8b93397c-8d2f-401d-8333-38b121ef2917 service nova] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Received unexpected event network-vif-plugged-d6f98d8d-f918-4aa5-abb0-a34e782f890a for instance with vm_state building and task_state spawning. Apr 24 00:11:57 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] During sync_power_state the instance has a pending task (spawning). Skip. 
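A few records above, the driver builds the instance's config drive by shelling out to genisoimage with the flags shown in the "Running cmd (subprocess)" entry. Below is a minimal sketch of the same invocation via Python's subprocess module; the output path and staging directory are placeholders (not values from this run), and the metadata files Nova writes into the staging directory are assumed to already exist. Note that "-publisher OpenStack Nova 0.0.0" in the log is a single argument whose quoting was flattened by the logger.

import subprocess

staging_dir = "/tmp/example-configdrive"   # placeholder; assumed to hold the openstack/ metadata tree
output_iso = "/var/tmp/disk.config"        # placeholder output path

cmd = [
    "genisoimage",
    "-o", output_iso,
    "-ldots", "-allow-lowercase", "-allow-multidot", "-l",
    "-publisher", "OpenStack Nova 0.0.0",  # single argument; the log line drops the quoting
    "-quiet", "-J", "-r",
    "-V", "config-2",                      # the volume label cloud-init looks for
    staging_dir,
]
subprocess.run(cmd, check=True)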
Apr 24 00:11:57 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Started> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:11:57 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] VM Started (Lifecycle Event) Apr 24 00:11:57 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:11:57 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:11:57 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:11:57 user nova-compute[71205]: INFO nova.compute.manager [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Took 8.50 seconds to spawn the instance on the hypervisor. Apr 24 00:11:57 user nova-compute[71205]: DEBUG nova.compute.manager [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:11:57 user nova-compute[71205]: INFO nova.compute.manager [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Took 9.78 seconds to build instance. 
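The two INFO records just above report per-instance timings ("Took 8.50 seconds to spawn the instance on the hypervisor", "Took 9.78 seconds to build instance"). A small, hedged sketch for pulling those durations out of a saved journal excerpt like this one follows; the input path and the regular expression are assumptions based only on the line format visible here.

import re
from collections import defaultdict

# Assumed: the journal excerpt was saved to a file, e.g.
# `journalctl -u devstack@n-cpu -o short > nova-compute.log` (path is a placeholder).
LOG_PATH = "nova-compute.log"

# Matches e.g. "[instance: 5e7bfc8d-...] Took 7.73 seconds to build instance."
PATTERN = re.compile(
    r"\[instance: (?P<uuid>[0-9a-f-]{36})\] "
    r"Took (?P<secs>\d+\.\d+) seconds to "
    r"(?P<phase>spawn the instance on the hypervisor|build instance)"
)

timings = defaultdict(dict)
with open(LOG_PATH) as fh:
    for line in fh:
        m = PATTERN.search(line)
        if m:
            phase = "spawn" if m.group("phase").startswith("spawn") else "build"
            timings[m.group("uuid")][phase] = float(m.group("secs"))

for uuid, phases in sorted(timings.items()):
    print(uuid, phases)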
Apr 24 00:11:57 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-7762c2c0-4661-4f90-a7f7-06bb271cb6ed tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.342s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:57 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:58 user nova-compute[71205]: DEBUG nova.compute.manager [req-2c227097-ebfa-45f5-b825-b1eab03acc12 req-eaa44602-55dd-4bf3-82f2-7fbd78b863f8 service nova] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Received event network-vif-plugged-5a56c96a-0083-47e5-819c-1d802bbcd6ea {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:11:58 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-2c227097-ebfa-45f5-b825-b1eab03acc12 req-eaa44602-55dd-4bf3-82f2-7fbd78b863f8 service nova] Acquiring lock "5e7bfc8d-7d4a-42f7-9657-cc65e1364b87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:11:58 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-2c227097-ebfa-45f5-b825-b1eab03acc12 req-eaa44602-55dd-4bf3-82f2-7fbd78b863f8 service nova] Lock "5e7bfc8d-7d4a-42f7-9657-cc65e1364b87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:11:58 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-2c227097-ebfa-45f5-b825-b1eab03acc12 req-eaa44602-55dd-4bf3-82f2-7fbd78b863f8 service nova] Lock "5e7bfc8d-7d4a-42f7-9657-cc65e1364b87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:11:58 user nova-compute[71205]: DEBUG nova.compute.manager [req-2c227097-ebfa-45f5-b825-b1eab03acc12 req-eaa44602-55dd-4bf3-82f2-7fbd78b863f8 service nova] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] No waiting events found dispatching network-vif-plugged-5a56c96a-0083-47e5-819c-1d802bbcd6ea {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:11:58 user nova-compute[71205]: WARNING nova.compute.manager [req-2c227097-ebfa-45f5-b825-b1eab03acc12 
req-eaa44602-55dd-4bf3-82f2-7fbd78b863f8 service nova] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Received unexpected event network-vif-plugged-5a56c96a-0083-47e5-819c-1d802bbcd6ea for instance with vm_state building and task_state spawning. Apr 24 00:11:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:11:59 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:00 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Resumed> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:12:00 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] VM Resumed (Lifecycle Event) Apr 24 00:12:00 user nova-compute[71205]: DEBUG nova.compute.manager [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Instance event wait completed in 0 seconds for {{(pid=71205) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 24 00:12:00 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Guest created on hypervisor {{(pid=71205) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 24 00:12:00 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Instance spawned successfully. 
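The plug sequence recorded above goes through os-vif's ovs plugin, which issues AddBridgeCommand, AddPortCommand and DbSetCommand transactions against the local OVSDB before libvirt starts the guest. The sketch below mirrors those three commands with the ovsdbapp API outside of Nova, purely as an illustration: the socket path, port name, MAC and UUIDs are placeholders, and connection setup details can vary between ovsdbapp releases.

from ovsdbapp.backend.ovs_idl import connection
from ovsdbapp.schema.open_vswitch import impl_idl

# Placeholder endpoint; os-vif talks to the local switch in the same way.
OVSDB = "unix:/usr/local/var/run/openvswitch/db.sock"

idl = connection.OvsdbIdl.from_server(OVSDB, "Open_vSwitch")
api = impl_idl.OvsdbIdl(connection.Connection(idl=idl, timeout=10))

# One transaction covering the three commands seen in the log: add the
# integration bridge (no-op if it exists), add the tap port, and tag the
# Interface row so the ovn/neutron side can bind it.
with api.transaction(check_error=True) as txn:
    txn.add(api.add_br("br-int", may_exist=True, datapath_type="system"))
    txn.add(api.add_port("br-int", "tap-example-00", may_exist=True))
    txn.add(api.db_set(
        "Interface", "tap-example-00",
        ("external_ids", {
            "iface-id": "00000000-0000-0000-0000-000000000000",   # placeholder Neutron port UUID
            "iface-status": "active",
            "attached-mac": "fa:16:3e:00:00:00",                   # placeholder MAC
            "vm-uuid": "00000000-0000-0000-0000-000000000000",     # placeholder instance UUID
        }),
    ))

Because both commands are issued with may_exist=True, a re-run against a bridge that is already present does nothing, which is why the log above shows "Transaction caused no change" after AddBridgeCommand.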
Apr 24 00:12:00 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Attempting to register defaults for the following image properties: ['hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 24 00:12:00 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Found default for hw_input_bus of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:12:00 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Found default for hw_pointer_model of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:12:00 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Found default for hw_video_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:12:00 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Found default for hw_vif_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:12:00 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:12:00 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:12:00 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] During sync_power_state the instance has a pending task (spawning). Skip. 
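The "Found default for hw_* of ..." records above show the libvirt driver registering fallback values for device-model image properties that were never set on the image. If deterministic guest hardware is preferred over driver defaults, those properties can be pinned on the Glance image up front; the following is a sketch using the openstack CLI via subprocess, with a placeholder image name and example property values rather than anything taken from this run.

import subprocess

IMAGE = "cirros-example"  # placeholder image name, not from this log

# Pin the guest device models explicitly so the driver does not have to
# fall back to defaults at spawn time (property names match the records above).
subprocess.run(
    [
        "openstack", "image", "set",
        "--property", "hw_disk_bus=virtio",
        "--property", "hw_vif_model=virtio",
        "--property", "hw_video_model=virtio",
        IMAGE,
    ],
    check=True,
)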
Apr 24 00:12:00 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Started> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:12:00 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] VM Started (Lifecycle Event) Apr 24 00:12:00 user nova-compute[71205]: INFO nova.compute.manager [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Took 6.60 seconds to spawn the instance on the hypervisor. Apr 24 00:12:00 user nova-compute[71205]: DEBUG nova.compute.manager [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:12:00 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:12:00 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:12:00 user nova-compute[71205]: DEBUG nova.compute.manager [req-0c3e867b-847c-4fad-806e-c16a969ba401 req-0501d3de-d7da-4b41-9d30-b6626faab8f4 service nova] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Received event network-vif-plugged-5a56c96a-0083-47e5-819c-1d802bbcd6ea {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:12:00 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-0c3e867b-847c-4fad-806e-c16a969ba401 req-0501d3de-d7da-4b41-9d30-b6626faab8f4 service nova] Acquiring lock "5e7bfc8d-7d4a-42f7-9657-cc65e1364b87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:12:00 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-0c3e867b-847c-4fad-806e-c16a969ba401 req-0501d3de-d7da-4b41-9d30-b6626faab8f4 service nova] Lock "5e7bfc8d-7d4a-42f7-9657-cc65e1364b87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:12:00 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-0c3e867b-847c-4fad-806e-c16a969ba401 req-0501d3de-d7da-4b41-9d30-b6626faab8f4 service nova] Lock "5e7bfc8d-7d4a-42f7-9657-cc65e1364b87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:12:00 user nova-compute[71205]: DEBUG nova.compute.manager [req-0c3e867b-847c-4fad-806e-c16a969ba401 
req-0501d3de-d7da-4b41-9d30-b6626faab8f4 service nova] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] No waiting events found dispatching network-vif-plugged-5a56c96a-0083-47e5-819c-1d802bbcd6ea {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:12:00 user nova-compute[71205]: WARNING nova.compute.manager [req-0c3e867b-847c-4fad-806e-c16a969ba401 req-0501d3de-d7da-4b41-9d30-b6626faab8f4 service nova] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Received unexpected event network-vif-plugged-5a56c96a-0083-47e5-819c-1d802bbcd6ea for instance with vm_state building and task_state spawning. Apr 24 00:12:00 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:12:00 user nova-compute[71205]: INFO nova.compute.manager [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Took 7.73 seconds to build instance. Apr 24 00:12:00 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-5a03ce46-d61a-42ad-9986-c4931699b3df tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Lock "5e7bfc8d-7d4a-42f7-9657-cc65e1364b87" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.851s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:12:01 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:03 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:05 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:06 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:07 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:12:07 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Cleaning up deleted instances {{(pid=71205) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 24 00:12:07 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] There are 0 instances to clean {{(pid=71205) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 24 00:12:07 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:12:07 user 
nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Cleaning up deleted instances with incomplete migration {{(pid=71205) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 24 00:12:07 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:12:08 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:08 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:09 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:12:09 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:12:10 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:12:10 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Acquiring lock "ac38bbc2-2229-4497-b501-e9230ec59a32" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:12:10 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "ac38bbc2-2229-4497-b501-e9230ec59a32" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:12:10 user nova-compute[71205]: DEBUG nova.compute.manager [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Starting instance... 
{{(pid=71205) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 24 00:12:10 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:12:10 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:12:10 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71205) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 24 00:12:10 user nova-compute[71205]: INFO nova.compute.claims [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Claim successful on node user Apr 24 00:12:11 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.407s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG nova.compute.manager [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] 
Start building networks asynchronously for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG nova.compute.manager [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Allocating IP information in the background. {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG nova.network.neutron [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] allocate_for_instance() {{(pid=71205) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 24 00:12:11 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 24 00:12:11 user nova-compute[71205]: DEBUG nova.compute.manager [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Start building block device mappings for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Starting heal instance info cache {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Rebuilding the list of instances to heal {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Skipping network cache update for instance because it is Building. 
{{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9805}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG nova.policy [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '514ecffec8034d60ae3c00ecd1ef5c8b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5cff0cbf3a5c4a4aadb3399a31adff0d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71205) authorize /opt/stack/nova/nova/policy.py:203}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG nova.compute.manager [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Start spawning the instance on the hypervisor. {{(pid=71205) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Creating instance directory {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 24 00:12:11 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Creating image(s) Apr 24 00:12:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Acquiring lock "/opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "/opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "/opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:12:11 user nova-compute[71205]: 
DEBUG oslo_concurrency.processutils [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "refresh_cache-dce8722e-982a-458a-9efb-59d08a5717c7" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquired lock "refresh_cache-dce8722e-982a-458a-9efb-59d08a5717c7" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Forcefully refreshing network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG nova.objects.instance [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lazy-loading 'info_cache' on Instance uuid dce8722e-982a-458a-9efb-59d08a5717c7 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.145s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Acquiring lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 
tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Acquiring lock "f1a14b79-7792-4962-bbe1-ec11e10e6948" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "f1a14b79-7792-4962-bbe1-ec11e10e6948" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG nova.compute.manager [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Starting instance... {{(pid=71205) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.131s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk 1073741824 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk 1073741824" returned: 0 in 0.069s {{(pid=71205) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.210s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71205) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 24 00:12:11 user nova-compute[71205]: INFO nova.compute.claims [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Claim successful on node user Apr 24 00:12:11 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.173s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Checking if we can resize image /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk. size=1073741824 {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 24 00:12:11 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk --force-share --output=json" returned: 0 in 0.208s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Cannot resize image /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk to a smaller size. 
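The "Checking if we can resize image ... size=1073741824" / "Cannot resize image ... to a smaller size" pair is a guard against shrinking a disk: the requested root-disk size from the flavor is compared with the image's current virtual size, and only growth is allowed. A rough illustration of that check (not Nova's exact code path), built on the same qemu-img info JSON the log shows being gathered:

import json
import subprocess

def virtual_size(path: str) -> int:
    # qemu-img reports the guest-visible size in bytes as "virtual-size".
    out = subprocess.run(
        ["qemu-img", "info", path, "--force-share", "--output=json"],
        capture_output=True, text=True, check=True,
    ).stdout
    return json.loads(out)["virtual-size"]

def can_resize_image(path: str, requested_size: int) -> bool:
    # Growing the disk is fine; shrinking it under an existing filesystem is not.
    return requested_size >= virtual_size(path)

disk = "/opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk"
print(can_resize_image(disk, 1073741824))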
{{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG nova.objects.instance [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lazy-loading 'migration_context' on Instance uuid ac38bbc2-2229-4497-b501-e9230ec59a32 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Created local disks {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Ensure instance console log exists: /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/console.log {{(pid=71205) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG nova.network.neutron [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Successfully created port: b716ea04-c5e7-43fc-9f20-5ecb011d6385 {{(pid=71205) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Updating instance_info_cache with network_info: [{"id": "caba6f96-07db-411c-a38b-86be3bb1c71a", "address": "fa:16:3e:f9:0a:e2", "network": {"id": "6bffe37b-19f2-4b15-8425-82fe8d0b0c77", "bridge": "br-int", "label": "tempest-VolumesActionsTest-258539693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": 
{"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "4ebbe37ffda44c76a78244b3928f809b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcaba6f96-07", "ovs_interfaceid": "caba6f96-07db-411c-a38b-86be3bb1c71a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Releasing lock "refresh_cache-dce8722e-982a-458a-9efb-59d08a5717c7" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Updated the network info_cache for instance {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71205) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.631s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG nova.compute.manager [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Start building networks asynchronously for instance. 
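The inventory reported to Placement above is what bounds scheduling onto this node: for each resource class the claimable capacity is (total - reserved) * allocation_ratio. Working that through with the values from the log:

# Values copied from the inventory logged above for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4.
inventory = {
    "MEMORY_MB": {"total": 16023, "reserved": 512, "allocation_ratio": 1.0},
    "VCPU":      {"total": 12,    "reserved": 0,   "allocation_ratio": 4.0},
    "DISK_GB":   {"total": 40,    "reserved": 0,   "allocation_ratio": 1.0},
}

for rc, inv in inventory.items():
    capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
    print(rc, capacity)
# MEMORY_MB 15511.0  (about 15.1 GiB of claimable RAM)
# VCPU 48.0          (12 host vCPUs oversubscribed 4:1)
# DISK_GB 40.0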
{{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.221s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG nova.compute.manager [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Allocating IP information in the background. {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG nova.network.neutron [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] allocate_for_instance() {{(pid=71205) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 24 00:12:12 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 24 00:12:12 user nova-compute[71205]: DEBUG nova.compute.manager [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Start building block device mappings for instance. 
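Almost every step above is bracketed by oslo_concurrency.lockutils acquire/release lines with "waited N s" and "held N s" timings; named locks such as "compute_resources", "vgpu_resources" or the base-image hash serialize concurrent builds within this process. A minimal sketch of the two usual usage patterns, assuming oslo.concurrency is installed; the function bodies are placeholders, not Nova code:

from oslo_concurrency import lockutils

# Context-manager form: the acquire/release (and wait/held timing) lines in the
# log come from entering and leaving a named lock like this.
def claim_resources():
    with lockutils.lock("compute_resources"):
        pass  # resource-tracker style bookkeeping would happen here

# Decorator form: every call takes the named lock before running the body.
@lockutils.synchronized("26d4c718c7a2a978d2022c858a570bbc0ccab5d8")
def create_qcow2_overlay():
    pass  # e.g. the qemu-img create step shown in the log

claim_resources()
create_qcow2_overlay()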
{{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG nova.policy [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '514ecffec8034d60ae3c00ecd1ef5c8b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5cff0cbf3a5c4a4aadb3399a31adff0d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71205) authorize /opt/stack/nova/nova/policy.py:203}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dce8722e-982a-458a-9efb-59d08a5717c7/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG nova.compute.manager [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Start spawning the instance on the hypervisor. {{(pid=71205) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Creating instance directory {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 24 00:12:12 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Creating image(s) Apr 24 00:12:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Acquiring lock "/opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "/opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None 
req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "/opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dce8722e-982a-458a-9efb-59d08a5717c7/disk --force-share --output=json" returned: 0 in 0.185s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dce8722e-982a-458a-9efb-59d08a5717c7/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.147s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Acquiring lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:12:12 user nova-compute[71205]: DEBUG 
oslo_concurrency.processutils [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dce8722e-982a-458a-9efb-59d08a5717c7/disk --force-share --output=json" returned: 0 in 0.157s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ffbf17ce-e3cb-4099-bea3-6887fef4476d/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.network.neutron [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Successfully updated port: b716ea04-c5e7-43fc-9f20-5ecb011d6385 {{(pid=71205) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.147s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk 1073741824 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Acquiring lock "refresh_cache-ac38bbc2-2229-4497-b501-e9230ec59a32" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None 
req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Acquired lock "refresh_cache-ac38bbc2-2229-4497-b501-e9230ec59a32" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.network.neutron [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Building network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.compute.manager [req-08e0c9ea-c445-48a1-a48d-18527776ce5d req-1ce047e6-8feb-437f-9b11-6e9da362b71e service nova] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Received event network-changed-b716ea04-c5e7-43fc-9f20-5ecb011d6385 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.compute.manager [req-08e0c9ea-c445-48a1-a48d-18527776ce5d req-1ce047e6-8feb-437f-9b11-6e9da362b71e service nova] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Refreshing instance network info cache due to event network-changed-b716ea04-c5e7-43fc-9f20-5ecb011d6385. {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-08e0c9ea-c445-48a1-a48d-18527776ce5d req-1ce047e6-8feb-437f-9b11-6e9da362b71e service nova] Acquiring lock "refresh_cache-ac38bbc2-2229-4497-b501-e9230ec59a32" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk 1073741824" returned: 0 in 0.054s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.210s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG 
oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ffbf17ce-e3cb-4099-bea3-6887fef4476d/disk --force-share --output=json" returned: 0 in 0.178s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ffbf17ce-e3cb-4099-bea3-6887fef4476d/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.network.neutron [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Instance cache missing network info. {{(pid=71205) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.139s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Checking if we can resize image /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk. 
size=1073741824 {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.network.neutron [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Successfully created port: fbd4aa76-4861-41fe-a0dc-5dee747b2517 {{(pid=71205) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ffbf17ce-e3cb-4099-bea3-6887fef4476d/disk --force-share --output=json" returned: 0 in 0.180s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e762c863-43e1-4f26-ab6b-c8ea40f08887/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Cannot resize image /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk to a smaller size. 
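The "Creating image(s)" steps above build each instance disk as a copy-on-write qcow2 overlay on top of the shared base image in _base, sized to the 1 GiB root disk, so only the guest's own writes consume new space. A sketch reproducing the qemu-img invocation exactly as logged for instance f1a14b79; the paths come from the log:

import subprocess

base = "/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8"
disk = "/opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk"

# qcow2 overlay on a raw backing file with a 1 GiB virtual size; passing
# backing_fmt avoids having qemu-img probe the backing image's format.
subprocess.run(
    ["env", "LC_ALL=C", "LANG=C",
     "qemu-img", "create", "-f", "qcow2",
     "-o", f"backing_file={base},backing_fmt=raw",
     disk, "1073741824"],
    check=True,
)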
{{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.objects.instance [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lazy-loading 'migration_context' on Instance uuid f1a14b79-7792-4962-bbe1-ec11e10e6948 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Created local disks {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Ensure instance console log exists: /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/console.log {{(pid=71205) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e762c863-43e1-4f26-ab6b-c8ea40f08887/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e762c863-43e1-4f26-ab6b-c8ea40f08887/disk --force-share --output=json {{(pid=71205) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.network.neutron [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Updating instance_info_cache with network_info: [{"id": "b716ea04-c5e7-43fc-9f20-5ecb011d6385", "address": "fa:16:3e:02:81:d9", "network": {"id": "ba4ccc60-ffcd-4638-9c93-6d4d59641961", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1591321320-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5cff0cbf3a5c4a4aadb3399a31adff0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb716ea04-c5", "ovs_interfaceid": "b716ea04-c5e7-43fc-9f20-5ecb011d6385", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e762c863-43e1-4f26-ab6b-c8ea40f08887/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Releasing lock "refresh_cache-ac38bbc2-2229-4497-b501-e9230ec59a32" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.compute.manager [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Instance network_info: |[{"id": "b716ea04-c5e7-43fc-9f20-5ecb011d6385", "address": "fa:16:3e:02:81:d9", "network": {"id": "ba4ccc60-ffcd-4638-9c93-6d4d59641961", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1591321320-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5cff0cbf3a5c4a4aadb3399a31adff0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb716ea04-c5", "ovs_interfaceid": "b716ea04-c5e7-43fc-9f20-5ecb011d6385", "qbh_params": null, "qbg_params": 
null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-08e0c9ea-c445-48a1-a48d-18527776ce5d req-1ce047e6-8feb-437f-9b11-6e9da362b71e service nova] Acquired lock "refresh_cache-ac38bbc2-2229-4497-b501-e9230ec59a32" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.network.neutron [req-08e0c9ea-c445-48a1-a48d-18527776ce5d req-1ce047e6-8feb-437f-9b11-6e9da362b71e service nova] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Refreshing network info cache for port b716ea04-c5e7-43fc-9f20-5ecb011d6385 {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Start _get_guest_xml network_info=[{"id": "b716ea04-c5e7-43fc-9f20-5ecb011d6385", "address": "fa:16:3e:02:81:d9", "network": {"id": "ba4ccc60-ffcd-4638-9c93-6d4d59641961", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1591321320-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5cff0cbf3a5c4a4aadb3399a31adff0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb716ea04-c5", "ovs_interfaceid": "b716ea04-c5e7-43fc-9f20-5ecb011d6385", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': 'fcf09ead-c5af-40cc-b5cf-92626e181ef9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): 
/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:12:13 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:12:13 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71205) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-24T00:08:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=), allow threads: True {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Flavor limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Image limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Flavor pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.virt.hardware [None 
req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Image pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Got 1 possible topologies {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:12:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-891210301',display_name='tempest-ServerRescueNegativeTestJSON-server-891210301',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-891210301',id=9,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5cff0cbf3a5c4a4aadb3399a31adff0d',ramdisk_id='',reservation_id='r-uhn0foxp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-487575741',owner_user_name='tempest-ServerRescueNegativeTestJSON-487575741-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:12:11Z,user_data=None,user_id='514ecffec8034d60ae3c00ecd1ef5c8b',uuid=ac38bbc2-2229-4497-b501-e9230ec59a32,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b716ea04-c5e7-43fc-9f20-5ecb011d6385", "address": "fa:16:3e:02:81:d9", "network": {"id": "ba4ccc60-ffcd-4638-9c93-6d4d59641961", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1591321320-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5cff0cbf3a5c4a4aadb3399a31adff0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb716ea04-c5", "ovs_interfaceid": "b716ea04-c5e7-43fc-9f20-5ecb011d6385", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71205) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Converting VIF {"id": "b716ea04-c5e7-43fc-9f20-5ecb011d6385", "address": 
"fa:16:3e:02:81:d9", "network": {"id": "ba4ccc60-ffcd-4638-9c93-6d4d59641961", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1591321320-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5cff0cbf3a5c4a4aadb3399a31adff0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb716ea04-c5", "ovs_interfaceid": "b716ea04-c5e7-43fc-9f20-5ecb011d6385", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:81:d9,bridge_name='br-int',has_traffic_filtering=True,id=b716ea04-c5e7-43fc-9f20-5ecb011d6385,network=Network(ba4ccc60-ffcd-4638-9c93-6d4d59641961),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb716ea04-c5') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.objects.instance [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lazy-loading 'pci_devices' on Instance uuid ac38bbc2-2229-4497-b501-e9230ec59a32 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] End _get_guest_xml xml= Apr 24 00:12:13 user nova-compute[71205]: ac38bbc2-2229-4497-b501-e9230ec59a32 Apr 24 00:12:13 user nova-compute[71205]: instance-00000009 Apr 24 00:12:13 user nova-compute[71205]: 131072 Apr 24 00:12:13 user nova-compute[71205]: 1 Apr 24 00:12:13 user 
nova-compute[71205]: [continuation of the libvirt domain XML dump for instance ac38bbc2-2229-4497-b501-e9230ec59a32; the XML element markup was stripped during capture, leaving only bare values (instance display name, memory and vCPU counts, SMBIOS strings OpenStack Foundation / OpenStack Nova / 0.0.0, the instance UUID, machine type hvm, CPU model Nehalem, RNG backend /dev/urandom)]
nova-compute[71205]: Apr 24 00:12:13 user nova-compute[71205]: Apr 24 00:12:13 user nova-compute[71205]: Apr 24 00:12:13 user nova-compute[71205]: {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:12:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-891210301',display_name='tempest-ServerRescueNegativeTestJSON-server-891210301',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-891210301',id=9,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5cff0cbf3a5c4a4aadb3399a31adff0d',ramdisk_id='',reservation_id='r-uhn0foxp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-487575741',owner_user_name='tempest-ServerRescueNegativeTestJSON-487575741-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:12:11Z,user_data=None,user_id='514ecffec8034d60ae3c00ecd1ef5c8b',uuid=ac38bbc2-2229-4497-b501-e9230ec59a32,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b716ea04-c5e7-43fc-9f20-5ecb011d6385", "address": "fa:16:3e:02:81:d9", "network": {"id": "ba4ccc60-ffcd-4638-9c93-6d4d59641961", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1591321320-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5cff0cbf3a5c4a4aadb3399a31adff0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb716ea04-c5", "ovs_interfaceid": "b716ea04-c5e7-43fc-9f20-5ecb011d6385", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Converting VIF {"id": "b716ea04-c5e7-43fc-9f20-5ecb011d6385", "address": "fa:16:3e:02:81:d9", "network": {"id": "ba4ccc60-ffcd-4638-9c93-6d4d59641961", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1591321320-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5cff0cbf3a5c4a4aadb3399a31adff0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb716ea04-c5", "ovs_interfaceid": "b716ea04-c5e7-43fc-9f20-5ecb011d6385", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:81:d9,bridge_name='br-int',has_traffic_filtering=True,id=b716ea04-c5e7-43fc-9f20-5ecb011d6385,network=Network(ba4ccc60-ffcd-4638-9c93-6d4d59641961),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb716ea04-c5') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG os_vif [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:81:d9,bridge_name='br-int',has_traffic_filtering=True,id=b716ea04-c5e7-43fc-9f20-5ecb011d6385,network=Network(ba4ccc60-ffcd-4638-9c93-6d4d59641961),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb716ea04-c5') {{(pid=71205) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 
{{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb716ea04-c5, may_exist=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb716ea04-c5, col_values=(('external_ids', {'iface-id': 'b716ea04-c5e7-43fc-9f20-5ecb011d6385', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:02:81:d9', 'vm-uuid': 'ac38bbc2-2229-4497-b501-e9230ec59a32'}),)) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:12:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:13 user nova-compute[71205]: INFO os_vif [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:81:d9,bridge_name='br-int',has_traffic_filtering=True,id=b716ea04-c5e7-43fc-9f20-5ecb011d6385,network=Network(ba4ccc60-ffcd-4638-9c93-6d4d59641961),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb716ea04-c5') Apr 24 00:12:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk --force-share --output=json" returned: 0 in 0.156s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] No BDM found with device name vda, not building metadata. 
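The AddBridgeCommand / AddPortCommand / DbSetCommand entries above are the os-vif 'ovs' plugin wiring tapb716ea04-c5 into br-int through ovsdbapp; "Transaction caused no change" simply means br-int already existed. A minimal standalone sketch of the same transaction using ovsdbapp directly follows; the OVSDB socket path and timeout are assumptions, while the bridge, port, MAC and UUID values are copied from the log entries above.

    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    # Assumed OVSDB endpoint; adjust to the local deployment.
    OVSDB_CONNECTION = 'unix:/usr/local/var/run/openvswitch/db.sock'

    idl = connection.OvsdbIdl.from_server(OVSDB_CONNECTION, 'Open_vSwitch')
    ovs = impl_idl.OvsdbIdl(connection.Connection(idl, timeout=10))

    # One transaction: ensure the bridge and port exist, then tag the
    # interface with the Neutron port id, MAC and instance UUID, as the
    # logged DbSetCommand does via external_ids.
    with ovs.transaction(check_error=True) as txn:
        txn.add(ovs.add_br('br-int', may_exist=True, datapath_type='system'))
        txn.add(ovs.add_port('br-int', 'tapb716ea04-c5', may_exist=True))
        txn.add(ovs.db_set(
            'Interface', 'tapb716ea04-c5',
            ('external_ids', {
                'iface-id': 'b716ea04-c5e7-43fc-9f20-5ecb011d6385',
                'iface-status': 'active',
                'attached-mac': 'fa:16:3e:02:81:d9',
                'vm-uuid': 'ac38bbc2-2229-4497-b501-e9230ec59a32',
            })))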
{{(pid=71205) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] No VIF found with MAC fa:16:3e:02:81:d9, not building metadata {{(pid=71205) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.network.neutron [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Successfully updated port: fbd4aa76-4861-41fe-a0dc-5dee747b2517 {{(pid=71205) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Acquiring lock "refresh_cache-f1a14b79-7792-4962-bbe1-ec11e10e6948" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Acquired lock "refresh_cache-f1a14b79-7792-4962-bbe1-ec11e10e6948" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.network.neutron [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Building network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.compute.manager [req-49c4dace-4c5e-4990-8b74-2445ce96d93e req-96b1b79e-df00-4186-a02d-7cc684a72bfa service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Received event network-changed-fbd4aa76-4861-41fe-a0dc-5dee747b2517 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.compute.manager [req-49c4dace-4c5e-4990-8b74-2445ce96d93e req-96b1b79e-df00-4186-a02d-7cc684a72bfa service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Refreshing instance network info cache due to event network-changed-fbd4aa76-4861-41fe-a0dc-5dee747b2517. 
{{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-49c4dace-4c5e-4990-8b74-2445ce96d93e req-96b1b79e-df00-4186-a02d-7cc684a72bfa service nova] Acquiring lock "refresh_cache-f1a14b79-7792-4962-bbe1-ec11e10e6948" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json" returned: 0 in 0.161s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.network.neutron [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Instance cache missing network info. {{(pid=71205) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e7bfc8d-7d4a-42f7-9657-cc65e1364b87/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e7bfc8d-7d4a-42f7-9657-cc65e1364b87/disk --force-share --output=json" returned: 0 in 0.150s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e7bfc8d-7d4a-42f7-9657-cc65e1364b87/disk --force-share 
--output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.network.neutron [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Updating instance_info_cache with network_info: [{"id": "fbd4aa76-4861-41fe-a0dc-5dee747b2517", "address": "fa:16:3e:33:34:0d", "network": {"id": "ba4ccc60-ffcd-4638-9c93-6d4d59641961", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1591321320-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5cff0cbf3a5c4a4aadb3399a31adff0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbd4aa76-48", "ovs_interfaceid": "fbd4aa76-4861-41fe-a0dc-5dee747b2517", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Releasing lock "refresh_cache-f1a14b79-7792-4962-bbe1-ec11e10e6948" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.compute.manager [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Instance network_info: |[{"id": "fbd4aa76-4861-41fe-a0dc-5dee747b2517", "address": "fa:16:3e:33:34:0d", "network": {"id": "ba4ccc60-ffcd-4638-9c93-6d4d59641961", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1591321320-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5cff0cbf3a5c4a4aadb3399a31adff0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbd4aa76-48", "ovs_interfaceid": "fbd4aa76-4861-41fe-a0dc-5dee747b2517", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-49c4dace-4c5e-4990-8b74-2445ce96d93e req-96b1b79e-df00-4186-a02d-7cc684a72bfa service nova] Acquired lock 
"refresh_cache-f1a14b79-7792-4962-bbe1-ec11e10e6948" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.network.neutron [req-49c4dace-4c5e-4990-8b74-2445ce96d93e req-96b1b79e-df00-4186-a02d-7cc684a72bfa service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Refreshing network info cache for port fbd4aa76-4861-41fe-a0dc-5dee747b2517 {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Start _get_guest_xml network_info=[{"id": "fbd4aa76-4861-41fe-a0dc-5dee747b2517", "address": "fa:16:3e:33:34:0d", "network": {"id": "ba4ccc60-ffcd-4638-9c93-6d4d59641961", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1591321320-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5cff0cbf3a5c4a4aadb3399a31adff0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbd4aa76-48", "ovs_interfaceid": "fbd4aa76-4861-41fe-a0dc-5dee747b2517", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': 'fcf09ead-c5af-40cc-b5cf-92626e181ef9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.network.neutron [req-08e0c9ea-c445-48a1-a48d-18527776ce5d req-1ce047e6-8feb-437f-9b11-6e9da362b71e service nova] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Updated VIF entry in instance network info cache for port b716ea04-c5e7-43fc-9f20-5ecb011d6385. 
{{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.network.neutron [req-08e0c9ea-c445-48a1-a48d-18527776ce5d req-1ce047e6-8feb-437f-9b11-6e9da362b71e service nova] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Updating instance_info_cache with network_info: [{"id": "b716ea04-c5e7-43fc-9f20-5ecb011d6385", "address": "fa:16:3e:02:81:d9", "network": {"id": "ba4ccc60-ffcd-4638-9c93-6d4d59641961", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1591321320-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5cff0cbf3a5c4a4aadb3399a31adff0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb716ea04-c5", "ovs_interfaceid": "b716ea04-c5e7-43fc-9f20-5ecb011d6385", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:12:14 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:12:14 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
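The Acquiring/Acquired/Releasing lock "refresh_cache-<instance uuid>" entries above come from oslo.concurrency's lockutils, which serialises rebuilds of a single instance's network info cache between the spawn path and the external-event handler. A minimal sketch of that locking pattern, with a placeholder body rather than Nova's actual cache-refresh code:

    from oslo_concurrency import lockutils

    def refresh_instance_cache(instance_uuid):
        # One in-process lock per instance, named exactly as in the log.
        with lockutils.lock('refresh_cache-%s' % instance_uuid):
            # Placeholder for the work done under the lock (rebuilding
            # instance_info_cache from Neutron port data).
            pass

    # The same helper is also available as a decorator:
    @lockutils.synchronized('refresh_cache-f1a14b79-7792-4962-bbe1-ec11e10e6948')
    def refresh_one_instance():
        pass

By default these are in-process locks; passing external=True would turn them into file-based locks shared across processes.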
Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71205) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-24T00:08:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=), allow threads: True {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Flavor limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Image limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Flavor pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Image pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Got 1 possible topologies {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:12:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-265518045',display_name='tempest-ServerRescueNegativeTestJSON-server-265518045',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-265518045',id=10,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5cff0cbf3a5c4a4aadb3399a31adff0d',ramdisk_id='',reservation_id='r-xi2sy34x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-487575741',owner_user_name='tempest-ServerRescueNegativeTestJSON-487575741-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:12:13Z,user_data=None,user_id='514ecffec8034d60ae3c00ecd1ef5c8b',uuid=f1a14b79-7792-4962-bbe1-ec11e10e6948,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fbd4aa76-4861-41fe-a0dc-5dee747b2517", "address": "fa:16:3e:33:34:0d", "network": {"id": "ba4ccc60-ffcd-4638-9c93-6d4d59641961", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1591321320-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5cff0cbf3a5c4a4aadb3399a31adff0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbd4aa76-48", "ovs_interfaceid": "fbd4aa76-4861-41fe-a0dc-5dee747b2517", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71205) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Converting VIF {"id": "fbd4aa76-4861-41fe-a0dc-5dee747b2517", "address": 
"fa:16:3e:33:34:0d", "network": {"id": "ba4ccc60-ffcd-4638-9c93-6d4d59641961", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1591321320-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5cff0cbf3a5c4a4aadb3399a31adff0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbd4aa76-48", "ovs_interfaceid": "fbd4aa76-4861-41fe-a0dc-5dee747b2517", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:34:0d,bridge_name='br-int',has_traffic_filtering=True,id=fbd4aa76-4861-41fe-a0dc-5dee747b2517,network=Network(ba4ccc60-ffcd-4638-9c93-6d4d59641961),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbd4aa76-48') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.objects.instance [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lazy-loading 'pci_devices' on Instance uuid f1a14b79-7792-4962-bbe1-ec11e10e6948 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e7bfc8d-7d4a-42f7-9657-cc65e1364b87/disk --force-share --output=json" returned: 0 in 0.219s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-08e0c9ea-c445-48a1-a48d-18527776ce5d req-1ce047e6-8feb-437f-9b11-6e9da362b71e service nova] Releasing lock "refresh_cache-ac38bbc2-2229-4497-b501-e9230ec59a32" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/212a2ad6-77ab-4615-b6ca-f426a3e76ab5/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 
tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] End _get_guest_xml xml= [libvirt domain XML for instance-0000000a logged here line by line; the XML element markup was stripped during capture, leaving only bare values (instance display name, memory 131072 KiB, 1 vCPU, SMBIOS strings OpenStack Foundation / OpenStack Nova / 0.0.0, the instance UUID, machine type hvm, CPU model Nehalem); the stripped dump continues on the next line]
nova-compute[71205]: Apr 24 00:12:14 user nova-compute[71205]: Apr 24 00:12:14 user nova-compute[71205]: Apr 24 00:12:14 user nova-compute[71205]: Apr 24 00:12:14 user nova-compute[71205]: /dev/urandom Apr 24 00:12:14 user nova-compute[71205]: Apr 24 00:12:14 user nova-compute[71205]: Apr 24 00:12:14 user nova-compute[71205]: Apr 24 00:12:14 user nova-compute[71205]: Apr 24 00:12:14 user nova-compute[71205]: Apr 24 00:12:14 user nova-compute[71205]: Apr 24 00:12:14 user nova-compute[71205]: Apr 24 00:12:14 user nova-compute[71205]: {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:12:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-265518045',display_name='tempest-ServerRescueNegativeTestJSON-server-265518045',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-265518045',id=10,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5cff0cbf3a5c4a4aadb3399a31adff0d',ramdisk_id='',reservation_id='r-xi2sy34x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-487575741',owner_user_name='tempest-ServerRescueNegativeTestJSON-487575741-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:12:13Z,user_data=None,user_id='514ecffec8034d60ae3c00ecd1ef5c8b',uuid=f1a14b79-7792-4962-bbe1-ec11e10e6948,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fbd4aa76-4861-41fe-a0dc-5dee747b2517", "address": "fa:16:3e:33:34:0d", "network": {"id": "ba4ccc60-ffcd-4638-9c93-6d4d59641961", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1591321320-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "5cff0cbf3a5c4a4aadb3399a31adff0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbd4aa76-48", "ovs_interfaceid": "fbd4aa76-4861-41fe-a0dc-5dee747b2517", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Converting VIF {"id": "fbd4aa76-4861-41fe-a0dc-5dee747b2517", "address": "fa:16:3e:33:34:0d", "network": {"id": "ba4ccc60-ffcd-4638-9c93-6d4d59641961", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1591321320-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5cff0cbf3a5c4a4aadb3399a31adff0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbd4aa76-48", "ovs_interfaceid": "fbd4aa76-4861-41fe-a0dc-5dee747b2517", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:34:0d,bridge_name='br-int',has_traffic_filtering=True,id=fbd4aa76-4861-41fe-a0dc-5dee747b2517,network=Network(ba4ccc60-ffcd-4638-9c93-6d4d59641961),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbd4aa76-48') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG os_vif [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:34:0d,bridge_name='br-int',has_traffic_filtering=True,id=fbd4aa76-4861-41fe-a0dc-5dee747b2517,network=Network(ba4ccc60-ffcd-4638-9c93-6d4d59641961),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbd4aa76-48') {{(pid=71205) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71205) do_commit 
/usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbd4aa76-48, may_exist=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfbd4aa76-48, col_values=(('external_ids', {'iface-id': 'fbd4aa76-4861-41fe-a0dc-5dee747b2517', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:33:34:0d', 'vm-uuid': 'f1a14b79-7792-4962-bbe1-ec11e10e6948'}),)) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:14 user nova-compute[71205]: INFO os_vif [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:34:0d,bridge_name='br-int',has_traffic_filtering=True,id=fbd4aa76-4861-41fe-a0dc-5dee747b2517,network=Network(ba4ccc60-ffcd-4638-9c93-6d4d59641961),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbd4aa76-48') Apr 24 00:12:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/212a2ad6-77ab-4615-b6ca-f426a3e76ab5/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/212a2ad6-77ab-4615-b6ca-f426a3e76ab5/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 
tempest-ServerRescueNegativeTestJSON-487575741-project-member] No BDM found with device name vda, not building metadata. {{(pid=71205) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 24 00:12:14 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] No VIF found with MAC fa:16:3e:33:34:0d, not building metadata {{(pid=71205) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 24 00:12:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/212a2ad6-77ab-4615-b6ca-f426a3e76ab5/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:12:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:12:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:12:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:12:15 user nova-compute[71205]: DEBUG nova.network.neutron [req-49c4dace-4c5e-4990-8b74-2445ce96d93e req-96b1b79e-df00-4186-a02d-7cc684a72bfa service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Updated VIF entry in instance network info cache for port fbd4aa76-4861-41fe-a0dc-5dee747b2517. 
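The recurring "qemu-img info ... --force-share --output=json" commands above (req-6f7e8b73, with no project context) look like a periodic task walking the instance disks under /opt/stack/data/nova/instances; the "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30" wrapper is oslo.concurrency capping the child's address space at 1 GiB and its CPU time at 30 s. A hedged sketch of issuing the same probe through processutils, with the disk path taken from the log:

    import json

    from oslo_concurrency import processutils

    # Same limits as the logged prlimit wrapper: 1 GiB address space, 30 s CPU.
    QEMU_IMG_LIMITS = processutils.ProcessLimits(address_space=1073741824,
                                                 cpu_time=30)

    def qemu_img_info(path):
        # When prlimit is given, processutils re-execs the command through
        # "python -m oslo_concurrency.prlimit", matching the logged cmdline.
        out, _err = processutils.execute(
            'env', 'LC_ALL=C', 'LANG=C',
            'qemu-img', 'info', path, '--force-share', '--output=json',
            prlimit=QEMU_IMG_LIMITS)
        return json.loads(out)

    info = qemu_img_info(
        '/opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk')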
{{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:12:15 user nova-compute[71205]: DEBUG nova.network.neutron [req-49c4dace-4c5e-4990-8b74-2445ce96d93e req-96b1b79e-df00-4186-a02d-7cc684a72bfa service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Updating instance_info_cache with network_info: [{"id": "fbd4aa76-4861-41fe-a0dc-5dee747b2517", "address": "fa:16:3e:33:34:0d", "network": {"id": "ba4ccc60-ffcd-4638-9c93-6d4d59641961", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1591321320-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5cff0cbf3a5c4a4aadb3399a31adff0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbd4aa76-48", "ovs_interfaceid": "fbd4aa76-4861-41fe-a0dc-5dee747b2517", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:12:15 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-49c4dace-4c5e-4990-8b74-2445ce96d93e req-96b1b79e-df00-4186-a02d-7cc684a72bfa service nova] Releasing lock "refresh_cache-f1a14b79-7792-4962-bbe1-ec11e10e6948" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:12:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Resumed> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:12:19 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] VM Resumed (Lifecycle Event) Apr 24 00:12:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.compute.manager [req-eb7e43c1-ae23-4dbc-84a2-e240beb3fa87 req-308b830c-d8b1-435d-812e-8736beda8cf8 service nova] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Received event network-vif-plugged-b716ea04-c5e7-43fc-9f20-5ecb011d6385 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-eb7e43c1-ae23-4dbc-84a2-e240beb3fa87 req-308b830c-d8b1-435d-812e-8736beda8cf8 service nova] Acquiring lock "ac38bbc2-2229-4497-b501-e9230ec59a32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" 
{{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-eb7e43c1-ae23-4dbc-84a2-e240beb3fa87 req-308b830c-d8b1-435d-812e-8736beda8cf8 service nova] Lock "ac38bbc2-2229-4497-b501-e9230ec59a32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-eb7e43c1-ae23-4dbc-84a2-e240beb3fa87 req-308b830c-d8b1-435d-812e-8736beda8cf8 service nova] Lock "ac38bbc2-2229-4497-b501-e9230ec59a32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.compute.manager [req-eb7e43c1-ae23-4dbc-84a2-e240beb3fa87 req-308b830c-d8b1-435d-812e-8736beda8cf8 service nova] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] No waiting events found dispatching network-vif-plugged-b716ea04-c5e7-43fc-9f20-5ecb011d6385 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:12:19 user nova-compute[71205]: WARNING nova.compute.manager [req-eb7e43c1-ae23-4dbc-84a2-e240beb3fa87 req-308b830c-d8b1-435d-812e-8736beda8cf8 service nova] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Received unexpected event network-vif-plugged-b716ea04-c5e7-43fc-9f20-5ecb011d6385 for instance with vm_state building and task_state spawning. Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.compute.manager [req-d6be5273-5b5a-426a-88a6-6a220fcc4807 req-1b103b23-d02b-4e8d-8d29-da1886f561ea service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Received event network-vif-plugged-fbd4aa76-4861-41fe-a0dc-5dee747b2517 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-d6be5273-5b5a-426a-88a6-6a220fcc4807 req-1b103b23-d02b-4e8d-8d29-da1886f561ea service nova] Acquiring lock "f1a14b79-7792-4962-bbe1-ec11e10e6948-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-d6be5273-5b5a-426a-88a6-6a220fcc4807 req-1b103b23-d02b-4e8d-8d29-da1886f561ea service nova] Lock "f1a14b79-7792-4962-bbe1-ec11e10e6948-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-d6be5273-5b5a-426a-88a6-6a220fcc4807 req-1b103b23-d02b-4e8d-8d29-da1886f561ea service nova] Lock "f1a14b79-7792-4962-bbe1-ec11e10e6948-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.compute.manager [req-d6be5273-5b5a-426a-88a6-6a220fcc4807 req-1b103b23-d02b-4e8d-8d29-da1886f561ea service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] No waiting events found dispatching 
network-vif-plugged-fbd4aa76-4861-41fe-a0dc-5dee747b2517 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:12:19 user nova-compute[71205]: WARNING nova.compute.manager [req-d6be5273-5b5a-426a-88a6-6a220fcc4807 req-1b103b23-d02b-4e8d-8d29-da1886f561ea service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Received unexpected event network-vif-plugged-fbd4aa76-4861-41fe-a0dc-5dee747b2517 for instance with vm_state building and task_state spawning. Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.compute.manager [req-ff8ceb22-c374-483e-82d4-8889a00a8242 req-2f305c1c-636f-423a-a17d-e43ef4501b2c service nova] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Received event network-vif-plugged-b716ea04-c5e7-43fc-9f20-5ecb011d6385 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-ff8ceb22-c374-483e-82d4-8889a00a8242 req-2f305c1c-636f-423a-a17d-e43ef4501b2c service nova] Acquiring lock "ac38bbc2-2229-4497-b501-e9230ec59a32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-ff8ceb22-c374-483e-82d4-8889a00a8242 req-2f305c1c-636f-423a-a17d-e43ef4501b2c service nova] Lock "ac38bbc2-2229-4497-b501-e9230ec59a32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-ff8ceb22-c374-483e-82d4-8889a00a8242 req-2f305c1c-636f-423a-a17d-e43ef4501b2c service nova] Lock "ac38bbc2-2229-4497-b501-e9230ec59a32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.compute.manager [req-ff8ceb22-c374-483e-82d4-8889a00a8242 req-2f305c1c-636f-423a-a17d-e43ef4501b2c service nova] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] No waiting events found dispatching network-vif-plugged-b716ea04-c5e7-43fc-9f20-5ecb011d6385 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:12:19 user nova-compute[71205]: WARNING nova.compute.manager [req-ff8ceb22-c374-483e-82d4-8889a00a8242 req-2f305c1c-636f-423a-a17d-e43ef4501b2c service nova] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Received unexpected event network-vif-plugged-b716ea04-c5e7-43fc-9f20-5ecb011d6385 for instance with vm_state building and task_state spawning. 
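The network-vif-plugged events dispatched above reach nova-compute from Neutron through Nova's os-server-external-events API; because both instances are still spawning and nothing has registered a wait for these particular events yet, they are logged as unexpected and dropped. For reference, a request body equivalent to the event for port fbd4aa76-4861-41fe-a0dc-5dee747b2517 would look roughly like the sketch below (endpoint, microversion and token handling omitted; the payload shape is illustrative and not taken from this log).

```python
# Illustrative body for Nova's os-server-external-events API, which is how
# Neutron reports "network-vif-plugged" to nova-compute. The UUIDs are the
# ones visible in the log above; everything else is an assumption.
import json

payload = {
    "events": [{
        "name": "network-vif-plugged",
        "server_uuid": "f1a14b79-7792-4962-bbe1-ec11e10e6948",  # instance
        "tag": "fbd4aa76-4861-41fe-a0dc-5dee747b2517",          # Neutron port
        "status": "completed",
    }]
}
print(json.dumps(payload, indent=2))
# Neutron POSTs this to /v2.1/os-server-external-events with service credentials.
```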
Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.compute.manager [req-0eccbec9-02a9-4b9b-8dea-302cd96bcab2 req-1fe458b0-1a18-4747-aaa8-0d6b0682b873 service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Received event network-vif-plugged-fbd4aa76-4861-41fe-a0dc-5dee747b2517 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-0eccbec9-02a9-4b9b-8dea-302cd96bcab2 req-1fe458b0-1a18-4747-aaa8-0d6b0682b873 service nova] Acquiring lock "f1a14b79-7792-4962-bbe1-ec11e10e6948-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-0eccbec9-02a9-4b9b-8dea-302cd96bcab2 req-1fe458b0-1a18-4747-aaa8-0d6b0682b873 service nova] Lock "f1a14b79-7792-4962-bbe1-ec11e10e6948-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-0eccbec9-02a9-4b9b-8dea-302cd96bcab2 req-1fe458b0-1a18-4747-aaa8-0d6b0682b873 service nova] Lock "f1a14b79-7792-4962-bbe1-ec11e10e6948-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.compute.manager [req-0eccbec9-02a9-4b9b-8dea-302cd96bcab2 req-1fe458b0-1a18-4747-aaa8-0d6b0682b873 service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] No waiting events found dispatching network-vif-plugged-fbd4aa76-4861-41fe-a0dc-5dee747b2517 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:12:19 user nova-compute[71205]: WARNING nova.compute.manager [req-0eccbec9-02a9-4b9b-8dea-302cd96bcab2 req-1fe458b0-1a18-4747-aaa8-0d6b0682b873 service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Received unexpected event network-vif-plugged-fbd4aa76-4861-41fe-a0dc-5dee747b2517 for instance with vm_state building and task_state spawning. 
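The disk probes logged at 00:12:14-00:12:15 run `qemu-img info` under `oslo_concurrency.prlimit` so a malformed or hostile image cannot exhaust memory or CPU on the compute host. A minimal sketch of the same pattern through oslo.concurrency, assuming the limits shown on the logged command line (1 GiB address space, 30 s CPU) and a placeholder disk path, is:

```python
# Sketch: probe a disk image the way nova-compute does above, with qemu-img
# wrapped in oslo.concurrency's prlimit helper. Limit values mirror the logged
# command (--as=1073741824 --cpu=30); the path is a placeholder.
import json
from oslo_concurrency import processutils

QEMU_IMG_LIMITS = processutils.ProcessLimits(
    cpu_time=30,                  # seconds of CPU before the probe is killed
    address_space=1 * 1024 ** 3,  # 1 GiB virtual memory cap
)

def qemu_img_info(path):
    out, _err = processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'info', path, '--force-share', '--output=json',
        prlimit=QEMU_IMG_LIMITS)
    return json.loads(out)

info = qemu_img_info('/opt/stack/data/nova/instances/<instance-uuid>/disk')
print(info.get('format'), info.get('virtual-size'))
```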
Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.compute.manager [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Instance event wait completed in 0 seconds for {{(pid=71205) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Guest created on hypervisor {{(pid=71205) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.compute.manager [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Instance event wait completed in 0 seconds for {{(pid=71205) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Guest created on hypervisor {{(pid=71205) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:12:19 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Instance spawned successfully. Apr 24 00:12:19 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:12:19 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
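The repeated warning about multiple sockets per NUMA node means the host topology reported by libvirt cannot honour the `socket` PCI NUMA affinity policy. A rough way to observe the same mismatch outside Nova is to compare the socket count with the NUMA cell count in the libvirt capabilities XML; this is only an approximation of Nova's own check, sketched here with the standard libvirt-python bindings.

```python
# Approximate, illustrative check behind the "multiple sockets per NUMA node"
# warning: compare sockets in the host CPU topology with the number of NUMA
# cells libvirt reports. Nova derives this from its own host/capability
# objects, not this parsing.
import libvirt
import xml.etree.ElementTree as ET

conn = libvirt.openReadOnly('qemu:///system')
caps = ET.fromstring(conn.getCapabilities())
conn.close()

sockets = int(caps.find('./host/cpu/topology').get('sockets'))
cells = len(caps.findall('./host/topology/cells/cell'))

if cells and sockets > cells:
    print(f"{sockets} sockets across {cells} NUMA cell(s): "
          "'socket' PCI NUMA affinity cannot be supported")
else:
    print("topology looks compatible with the 'socket' affinity policy")
```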
Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Hypervisor/Node resource view: name=user free_ram=7787MB free_disk=26.62063217163086GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 24 00:12:19 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Instance spawned successfully. Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 24 00:12:19 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Started> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:12:19 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] VM Started (Lifecycle Event) Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Found default for hw_cdrom_bus of ide {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Found default for hw_disk_bus of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Found default for hw_input_bus of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Found default for hw_pointer_model of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Found default for hw_video_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Found default for hw_vif_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Found default for hw_cdrom_bus of ide {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 
tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Found default for hw_disk_bus of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Found default for hw_input_bus of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Found default for hw_pointer_model of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Found default for hw_video_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Found default for hw_vif_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:19 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] During sync_power_state the instance has a pending task (spawning). Skip. 
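The "Found default for hw_... of ..." lines record the bus and model defaults the libvirt driver picked for image properties the image itself did not define; they are persisted on the instance with an image_ prefix, as the system_metadata in the Instance dump later in this log shows (image_hw_cdrom_bus='ide', image_hw_disk_bus='virtio', and so on). A trivial illustration of that mapping, using only the values logged above:

```python
# Illustration only: the defaults registered above, keyed as they appear in
# the instance's system_metadata (image_<property>). Values are copied from
# the log; the mapping code is not nova's.
REGISTERED_DEFAULTS = {
    'hw_cdrom_bus': 'ide',
    'hw_disk_bus': 'virtio',
    'hw_input_bus': None,
    'hw_pointer_model': None,
    'hw_video_model': 'virtio',
    'hw_vif_model': 'virtio',
}

system_metadata = {f'image_{k}': v for k, v in REGISTERED_DEFAULTS.items()}
print(system_metadata['image_hw_disk_bus'])  # virtio
```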
Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Resumed> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:12:19 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] VM Resumed (Lifecycle Event) Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:12:19 user nova-compute[71205]: INFO nova.compute.manager [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Took 8.45 seconds to spawn the instance on the hypervisor. Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.compute.manager [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:12:19 user nova-compute[71205]: INFO nova.compute.manager [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Took 7.08 seconds to spawn the instance on the hypervisor. Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.compute.manager [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:12:19 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] During sync_power_state the instance has a pending task (spawning). Skip. 
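Each Resumed/Started lifecycle event above ends in "During sync_power_state the instance has a pending task (spawning). Skip.": while a task_state is set, the DB power_state (0, NOSTATE) is left alone even though the hypervisor already reports power_state 1 (RUNNING). A compressed, illustrative reduction of that decision, with constants matching nova.compute.power_state:

```python
# Illustrative reduction of the sync_power_state decision logged above.
# NOSTATE/RUNNING match nova.compute.power_state; the function is a sketch,
# not nova's implementation.
NOSTATE, RUNNING = 0, 1

def sync_power_state(db_power_state, vm_power_state, task_state):
    if task_state is not None:
        return f'pending task ({task_state}), skip'
    if db_power_state != vm_power_state:
        return 'reconcile DB power_state with the hypervisor'
    return 'in sync'

# The case from the log: DB says 0 (NOSTATE), the VM reports 1 (RUNNING),
# but the spawn is still in flight.
print(sync_power_state(NOSTATE, RUNNING, 'spawning'))
```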
Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Started> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:12:19 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] VM Started (Lifecycle Event) Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:12:19 user nova-compute[71205]: INFO nova.compute.manager [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Took 8.23 seconds to build instance. Apr 24 00:12:19 user nova-compute[71205]: INFO nova.compute.manager [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Took 9.24 seconds to build instance. Apr 24 00:12:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-53eea3bc-cb3e-4072-942f-8d3d7266ab25 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "f1a14b79-7792-4962-bbe1-ec11e10e6948" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 8.344s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:12:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-a9bf7dd6-a1e5-4542-a146-92f9846b7bc5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "ac38bbc2-2229-4497-b501-e9230ec59a32" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 9.355s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:12:20 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance 212a2ad6-77ab-4615-b6ca-f426a3e76ab5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:12:20 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance ffbf17ce-e3cb-4099-bea3-6887fef4476d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:12:20 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance dce8722e-982a-458a-9efb-59d08a5717c7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:12:20 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance ac38bbc2-2229-4497-b501-e9230ec59a32 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:12:20 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance f1a14b79-7792-4962-bbe1-ec11e10e6948 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:12:20 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:12:20 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance ce19423d-a6ee-4506-9cd1-ec4803abdd86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:12:20 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:12:20 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:12:20 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance e762c863-43e1-4f26-ab6b-c8ea40f08887 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:12:20 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Total usable vcpus: 12, total allocated vcpus: 10 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:12:20 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Final resource view: name=user phys_ram=16023MB used_ram=1792MB phys_disk=40GB used_disk=10GB total_vcpus=12 used_vcpus=10 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:12:20 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Refreshing inventories for resource provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 24 00:12:20 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Updating ProviderTree inventory for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 24 00:12:20 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Updating inventory in ProviderTree for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 24 00:12:20 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Refreshing aggregate associations for resource provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4, aggregates: None {{(pid=71205) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 24 00:12:20 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Refreshing trait associations for resource provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4, traits: 
HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_DEVICE_TAGGING {{(pid=71205) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 24 00:12:20 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:12:20 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:12:20 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:12:20 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.846s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:12:20 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:12:23 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:28 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:33 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:39 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:43 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:44 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:48 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:49 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:55 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-83acb784-a137-40b7-84bf-4c5f8a7a149b tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Acquiring lock "dce8722e-982a-458a-9efb-59d08a5717c7" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:12:55 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-83acb784-a137-40b7-84bf-4c5f8a7a149b tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Lock "dce8722e-982a-458a-9efb-59d08a5717c7" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:12:55 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-83acb784-a137-40b7-84bf-4c5f8a7a149b tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Acquiring lock "dce8722e-982a-458a-9efb-59d08a5717c7-events" by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:12:55 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-83acb784-a137-40b7-84bf-4c5f8a7a149b tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Lock "dce8722e-982a-458a-9efb-59d08a5717c7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:12:55 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-83acb784-a137-40b7-84bf-4c5f8a7a149b tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Lock "dce8722e-982a-458a-9efb-59d08a5717c7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:12:55 user nova-compute[71205]: INFO nova.compute.manager [None req-83acb784-a137-40b7-84bf-4c5f8a7a149b tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Terminating instance Apr 24 00:12:55 user nova-compute[71205]: DEBUG nova.compute.manager [None req-83acb784-a137-40b7-84bf-4c5f8a7a149b tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Start destroying the instance on the hypervisor. {{(pid=71205) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 24 00:12:55 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:55 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:56 user nova-compute[71205]: DEBUG nova.compute.manager [req-dd1b9261-37c9-4a30-8167-d60d6af38a56 req-5e6d8c47-3af1-4e18-80b1-0bdf2917bdf7 service nova] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Received event network-vif-unplugged-caba6f96-07db-411c-a38b-86be3bb1c71a {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:12:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-dd1b9261-37c9-4a30-8167-d60d6af38a56 req-5e6d8c47-3af1-4e18-80b1-0bdf2917bdf7 service nova] Acquiring lock "dce8722e-982a-458a-9efb-59d08a5717c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:12:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-dd1b9261-37c9-4a30-8167-d60d6af38a56 req-5e6d8c47-3af1-4e18-80b1-0bdf2917bdf7 service nova] Lock "dce8722e-982a-458a-9efb-59d08a5717c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:12:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-dd1b9261-37c9-4a30-8167-d60d6af38a56 req-5e6d8c47-3af1-4e18-80b1-0bdf2917bdf7 service nova] Lock 
"dce8722e-982a-458a-9efb-59d08a5717c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:12:56 user nova-compute[71205]: DEBUG nova.compute.manager [req-dd1b9261-37c9-4a30-8167-d60d6af38a56 req-5e6d8c47-3af1-4e18-80b1-0bdf2917bdf7 service nova] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] No waiting events found dispatching network-vif-unplugged-caba6f96-07db-411c-a38b-86be3bb1c71a {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:12:56 user nova-compute[71205]: DEBUG nova.compute.manager [req-dd1b9261-37c9-4a30-8167-d60d6af38a56 req-5e6d8c47-3af1-4e18-80b1-0bdf2917bdf7 service nova] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Received event network-vif-unplugged-caba6f96-07db-411c-a38b-86be3bb1c71a for instance with task_state deleting. {{(pid=71205) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 24 00:12:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:56 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Instance destroyed successfully. Apr 24 00:12:56 user nova-compute[71205]: DEBUG nova.objects.instance [None req-83acb784-a137-40b7-84bf-4c5f8a7a149b tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Lazy-loading 'resources' on Instance uuid dce8722e-982a-458a-9efb-59d08a5717c7 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:12:56 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-83acb784-a137-40b7-84bf-4c5f8a7a149b tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:11:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-285922937',display_name='tempest-VolumesActionsTest-instance-285922937',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-285922937',id=1,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-24T00:11:18Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='4ebbe37ffda44c76a78244b3928f809b',ramdisk_id='',reservation_id='r-nr0b3uav',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesActionsTest-1689103952',owner_user_name='tempest-VolumesActionsTest-1689103952-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-24T00:11:19Z,user_data=None,user_id='31621d02dc5143689c0d8c1280479fdc',uuid=dce8722e-982a-458a-9efb-59d08a5717c7,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "caba6f96-07db-411c-a38b-86be3bb1c71a", "address": "fa:16:3e:f9:0a:e2", "network": {"id": "6bffe37b-19f2-4b15-8425-82fe8d0b0c77", "bridge": "br-int", "label": "tempest-VolumesActionsTest-258539693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "4ebbe37ffda44c76a78244b3928f809b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcaba6f96-07", "ovs_interfaceid": "caba6f96-07db-411c-a38b-86be3bb1c71a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 24 00:12:56 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-83acb784-a137-40b7-84bf-4c5f8a7a149b tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Converting VIF {"id": "caba6f96-07db-411c-a38b-86be3bb1c71a", "address": "fa:16:3e:f9:0a:e2", "network": {"id": 
"6bffe37b-19f2-4b15-8425-82fe8d0b0c77", "bridge": "br-int", "label": "tempest-VolumesActionsTest-258539693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "4ebbe37ffda44c76a78244b3928f809b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcaba6f96-07", "ovs_interfaceid": "caba6f96-07db-411c-a38b-86be3bb1c71a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:12:56 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-83acb784-a137-40b7-84bf-4c5f8a7a149b tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f9:0a:e2,bridge_name='br-int',has_traffic_filtering=True,id=caba6f96-07db-411c-a38b-86be3bb1c71a,network=Network(6bffe37b-19f2-4b15-8425-82fe8d0b0c77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcaba6f96-07') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:12:56 user nova-compute[71205]: DEBUG os_vif [None req-83acb784-a137-40b7-84bf-4c5f8a7a149b tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f9:0a:e2,bridge_name='br-int',has_traffic_filtering=True,id=caba6f96-07db-411c-a38b-86be3bb1c71a,network=Network(6bffe37b-19f2-4b15-8425-82fe8d0b0c77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcaba6f96-07') {{(pid=71205) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 24 00:12:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcaba6f96-07, bridge=br-int, if_exists=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:12:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:12:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:12:56 user nova-compute[71205]: INFO os_vif [None req-83acb784-a137-40b7-84bf-4c5f8a7a149b tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f9:0a:e2,bridge_name='br-int',has_traffic_filtering=True,id=caba6f96-07db-411c-a38b-86be3bb1c71a,network=Network(6bffe37b-19f2-4b15-8425-82fe8d0b0c77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcaba6f96-07') Apr 24 
00:12:56 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-83acb784-a137-40b7-84bf-4c5f8a7a149b tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Deleting instance files /opt/stack/data/nova/instances/dce8722e-982a-458a-9efb-59d08a5717c7_del Apr 24 00:12:56 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-83acb784-a137-40b7-84bf-4c5f8a7a149b tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Deletion of /opt/stack/data/nova/instances/dce8722e-982a-458a-9efb-59d08a5717c7_del complete Apr 24 00:12:56 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-83acb784-a137-40b7-84bf-4c5f8a7a149b tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Checking UEFI support for host arch (x86_64) {{(pid=71205) supports_uefi /opt/stack/nova/nova/virt/libvirt/host.py:1722}} Apr 24 00:12:56 user nova-compute[71205]: INFO nova.virt.libvirt.host [None req-83acb784-a137-40b7-84bf-4c5f8a7a149b tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] UEFI support detected Apr 24 00:12:56 user nova-compute[71205]: INFO nova.compute.manager [None req-83acb784-a137-40b7-84bf-4c5f8a7a149b tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Took 0.69 seconds to destroy the instance on the hypervisor. Apr 24 00:12:56 user nova-compute[71205]: DEBUG oslo.service.loopingcall [None req-83acb784-a137-40b7-84bf-4c5f8a7a149b tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71205) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 24 00:12:56 user nova-compute[71205]: DEBUG nova.compute.manager [-] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Deallocating network for instance {{(pid=71205) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 24 00:12:56 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] deallocate_for_instance() {{(pid=71205) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 24 00:12:57 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:12:57 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Took 0.54 seconds to deallocate network for instance. 
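The "Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return" line above is emitted from oslo.service's generic looping-call machinery (loopingcall.py). As a minimal, self-contained sketch of that pattern only, the snippet below drives a retried function with FixedIntervalLoopingCall and stops it by raising LoopingCallDone; the function name, retry count and interval are illustrative assumptions, not nova's actual deallocation code.

    # Sketch of the oslo.service looping-call pattern behind the
    # "Waiting for function ... to return" DEBUG line. Names and the
    # retry policy are illustrative only, not nova's implementation.
    from oslo_service import loopingcall


    def _deallocate_with_retries(state):
        """Hypothetical retried task: stop looping once the work succeeds."""
        state['attempts'] += 1
        if state['attempts'] >= 3:            # pretend the third try succeeds
            raise loopingcall.LoopingCallDone(retvalue=True)
        # returning normally means "run me again after the interval"


    if __name__ == '__main__':
        state = {'attempts': 0}
        timer = loopingcall.FixedIntervalLoopingCall(_deallocate_with_retries,
                                                     state)
        # start() returns an event; wait() blocks until LoopingCallDone
        # is raised and yields its retvalue.
        result = timer.start(interval=0.1).wait()
        print('deallocated=%s after %d attempts' % (result, state['attempts']))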
Apr 24 00:12:57 user nova-compute[71205]: DEBUG nova.compute.manager [req-0109fdef-6dee-41b9-ba96-cf56d5e42bc3 req-dca40480-445d-462b-9de9-8bb64ce97d9f service nova] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Received event network-vif-deleted-caba6f96-07db-411c-a38b-86be3bb1c71a {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:12:57 user nova-compute[71205]: INFO nova.compute.manager [req-0109fdef-6dee-41b9-ba96-cf56d5e42bc3 req-dca40480-445d-462b-9de9-8bb64ce97d9f service nova] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Neutron deleted interface caba6f96-07db-411c-a38b-86be3bb1c71a; detaching it from the instance and deleting it from the info cache Apr 24 00:12:57 user nova-compute[71205]: DEBUG nova.network.neutron [req-0109fdef-6dee-41b9-ba96-cf56d5e42bc3 req-dca40480-445d-462b-9de9-8bb64ce97d9f service nova] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:12:57 user nova-compute[71205]: DEBUG nova.compute.manager [req-0109fdef-6dee-41b9-ba96-cf56d5e42bc3 req-dca40480-445d-462b-9de9-8bb64ce97d9f service nova] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Detach interface failed, port_id=caba6f96-07db-411c-a38b-86be3bb1c71a, reason: Instance dce8722e-982a-458a-9efb-59d08a5717c7 could not be found. {{(pid=71205) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 24 00:12:57 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-83acb784-a137-40b7-84bf-4c5f8a7a149b tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:12:57 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-83acb784-a137-40b7-84bf-4c5f8a7a149b tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:12:57 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-83acb784-a137-40b7-84bf-4c5f8a7a149b tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:12:57 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-83acb784-a137-40b7-84bf-4c5f8a7a149b tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:12:57 user 
nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-83acb784-a137-40b7-84bf-4c5f8a7a149b tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.366s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:12:57 user nova-compute[71205]: INFO nova.scheduler.client.report [None req-83acb784-a137-40b7-84bf-4c5f8a7a149b tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Deleted allocations for instance dce8722e-982a-458a-9efb-59d08a5717c7 Apr 24 00:12:57 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-83acb784-a137-40b7-84bf-4c5f8a7a149b tempest-VolumesActionsTest-1689103952 tempest-VolumesActionsTest-1689103952-project-member] Lock "dce8722e-982a-458a-9efb-59d08a5717c7" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.950s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:12:58 user nova-compute[71205]: DEBUG nova.compute.manager [req-2f05ee8d-0b6a-49b5-9384-967319cb222b req-c0908fdf-2d26-4c83-960b-4cc1c6a704da service nova] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Received event network-vif-plugged-caba6f96-07db-411c-a38b-86be3bb1c71a {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:12:58 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-2f05ee8d-0b6a-49b5-9384-967319cb222b req-c0908fdf-2d26-4c83-960b-4cc1c6a704da service nova] Acquiring lock "dce8722e-982a-458a-9efb-59d08a5717c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:12:58 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-2f05ee8d-0b6a-49b5-9384-967319cb222b req-c0908fdf-2d26-4c83-960b-4cc1c6a704da service nova] Lock "dce8722e-982a-458a-9efb-59d08a5717c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:12:58 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-2f05ee8d-0b6a-49b5-9384-967319cb222b req-c0908fdf-2d26-4c83-960b-4cc1c6a704da service nova] Lock "dce8722e-982a-458a-9efb-59d08a5717c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:12:58 user nova-compute[71205]: DEBUG nova.compute.manager [req-2f05ee8d-0b6a-49b5-9384-967319cb222b req-c0908fdf-2d26-4c83-960b-4cc1c6a704da service nova] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] No waiting events found dispatching network-vif-plugged-caba6f96-07db-411c-a38b-86be3bb1c71a {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:12:58 user nova-compute[71205]: WARNING nova.compute.manager [req-2f05ee8d-0b6a-49b5-9384-967319cb222b req-c0908fdf-2d26-4c83-960b-4cc1c6a704da service nova] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Received unexpected event network-vif-plugged-caba6f96-07db-411c-a38b-86be3bb1c71a for instance with vm_state deleted and task_state None. 
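The recurring 'Acquiring lock ... / Lock ... acquired by ... waited / "released" by ... held' triplets (the "compute_resources" lock and the per-instance "<uuid>-events" locks above) come from oslo.concurrency's lockutils wrapper. Below is a minimal sketch of the same primitive using the public synchronized decorator and lock() context manager; the lock names and worker functions are examples, not nova code.

    # Sketch of the oslo.concurrency locking primitive behind the
    # acquire/release log lines. Lock names and functions are illustrative.
    from oslo_concurrency import lockutils


    @lockutils.synchronized('compute_resources')
    def update_usage_example():
        """Runs with the in-process lock named 'compute_resources' held."""
        return 'usage updated'


    def clear_events_example(instance_uuid):
        # The per-instance "<uuid>-events" naming pattern seen above,
        # via the context-manager form of the same API.
        with lockutils.lock('%s-events' % instance_uuid):
            return 'events cleared for %s' % instance_uuid


    if __name__ == '__main__':
        print(update_usage_example())
        print(clear_events_example('dce8722e-982a-458a-9efb-59d08a5717c7'))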
Apr 24 00:12:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:01 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:03 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:06 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:09 user nova-compute[71205]: DEBUG nova.compute.manager [req-3101f0a4-10c5-45de-b955-b9f30a80b08b req-7a77b534-9970-4470-84d8-94603a2bda89 service nova] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Received event network-changed-81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:13:09 user nova-compute[71205]: DEBUG nova.compute.manager [req-3101f0a4-10c5-45de-b955-b9f30a80b08b req-7a77b534-9970-4470-84d8-94603a2bda89 service nova] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Refreshing instance network info cache due to event network-changed-81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54. {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:13:09 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-3101f0a4-10c5-45de-b955-b9f30a80b08b req-7a77b534-9970-4470-84d8-94603a2bda89 service nova] Acquiring lock "refresh_cache-212a2ad6-77ab-4615-b6ca-f426a3e76ab5" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:13:09 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-3101f0a4-10c5-45de-b955-b9f30a80b08b req-7a77b534-9970-4470-84d8-94603a2bda89 service nova] Acquired lock "refresh_cache-212a2ad6-77ab-4615-b6ca-f426a3e76ab5" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:13:09 user nova-compute[71205]: DEBUG nova.network.neutron [req-3101f0a4-10c5-45de-b955-b9f30a80b08b req-7a77b534-9970-4470-84d8-94603a2bda89 service nova] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Refreshing network info cache for port 81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54 {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:13:10 user nova-compute[71205]: DEBUG nova.network.neutron [req-3101f0a4-10c5-45de-b955-b9f30a80b08b req-7a77b534-9970-4470-84d8-94603a2bda89 service nova] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Updated VIF entry in instance network info cache for port 81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54. 
{{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:13:10 user nova-compute[71205]: DEBUG nova.network.neutron [req-3101f0a4-10c5-45de-b955-b9f30a80b08b req-7a77b534-9970-4470-84d8-94603a2bda89 service nova] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Updating instance_info_cache with network_info: [{"id": "81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54", "address": "fa:16:3e:32:fe:fb", "network": {"id": "c29459b8-bcaa-41ae-94fa-d61344bae22f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2015616970-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d063c2bdc884fb8b826b9fb6fd97405", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap81bcd0fd-3b", "ovs_interfaceid": "81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:13:10 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-3101f0a4-10c5-45de-b955-b9f30a80b08b req-7a77b534-9970-4470-84d8-94603a2bda89 service nova] Releasing lock "refresh_cache-212a2ad6-77ab-4615-b6ca-f426a3e76ab5" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:13:10 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:13:10 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:13:10 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:13:11 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:13:11 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:13:11 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task 
ComputeManager._reclaim_queued_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:13:11 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71205) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 24 00:13:11 user nova-compute[71205]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:13:11 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] VM Stopped (Lifecycle Event) Apr 24 00:13:11 user nova-compute[71205]: DEBUG nova.compute.manager [None req-4313770c-d259-42fe-ac1c-f735d2687874 None None] [instance: dce8722e-982a-458a-9efb-59d08a5717c7] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:13:11 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:13:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-b05f9b0c-b584-43c3-a485-fd3a450d227d tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Acquiring lock "212a2ad6-77ab-4615-b6ca-f426a3e76ab5" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-b05f9b0c-b584-43c3-a485-fd3a450d227d tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "212a2ad6-77ab-4615-b6ca-f426a3e76ab5" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-b05f9b0c-b584-43c3-a485-fd3a450d227d tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Acquiring lock "212a2ad6-77ab-4615-b6ca-f426a3e76ab5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-b05f9b0c-b584-43c3-a485-fd3a450d227d tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "212a2ad6-77ab-4615-b6ca-f426a3e76ab5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-b05f9b0c-b584-43c3-a485-fd3a450d227d tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "212a2ad6-77ab-4615-b6ca-f426a3e76ab5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:11 user nova-compute[71205]: INFO 
nova.compute.manager [None req-b05f9b0c-b584-43c3-a485-fd3a450d227d tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Terminating instance Apr 24 00:13:11 user nova-compute[71205]: DEBUG nova.compute.manager [None req-b05f9b0c-b584-43c3-a485-fd3a450d227d tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Start destroying the instance on the hypervisor. {{(pid=71205) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 24 00:13:11 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:11 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:11 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:11 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:11 user nova-compute[71205]: DEBUG nova.compute.manager [req-bd556249-25e8-4d5e-b3ef-8469b711785e req-79ea5f29-e4a7-4128-be24-c508644a526a service nova] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Received event network-vif-unplugged-81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:13:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bd556249-25e8-4d5e-b3ef-8469b711785e req-79ea5f29-e4a7-4128-be24-c508644a526a service nova] Acquiring lock "212a2ad6-77ab-4615-b6ca-f426a3e76ab5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bd556249-25e8-4d5e-b3ef-8469b711785e req-79ea5f29-e4a7-4128-be24-c508644a526a service nova] Lock "212a2ad6-77ab-4615-b6ca-f426a3e76ab5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bd556249-25e8-4d5e-b3ef-8469b711785e req-79ea5f29-e4a7-4128-be24-c508644a526a service nova] Lock "212a2ad6-77ab-4615-b6ca-f426a3e76ab5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:11 user nova-compute[71205]: DEBUG nova.compute.manager [req-bd556249-25e8-4d5e-b3ef-8469b711785e req-79ea5f29-e4a7-4128-be24-c508644a526a service nova] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] No waiting events found dispatching network-vif-unplugged-81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:13:11 user nova-compute[71205]: DEBUG nova.compute.manager [req-bd556249-25e8-4d5e-b3ef-8469b711785e req-79ea5f29-e4a7-4128-be24-c508644a526a service nova] [instance: 
212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Received event network-vif-unplugged-81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54 for instance with task_state deleting. {{(pid=71205) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 24 00:13:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-48430cbe-4d1b-4507-a203-17fd2952518d tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Acquiring lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-48430cbe-4d1b-4507-a203-17fd2952518d tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-48430cbe-4d1b-4507-a203-17fd2952518d tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Acquiring lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-48430cbe-4d1b-4507-a203-17fd2952518d tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:11 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-48430cbe-4d1b-4507-a203-17fd2952518d tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:11 user nova-compute[71205]: INFO nova.compute.manager [None req-48430cbe-4d1b-4507-a203-17fd2952518d tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Terminating instance Apr 24 00:13:11 user nova-compute[71205]: DEBUG nova.compute.manager [None req-48430cbe-4d1b-4507-a203-17fd2952518d tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Start destroying the instance on the hypervisor. 
{{(pid=71205) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG nova.compute.manager [req-646ac53d-8e0c-43c2-92a0-1fe0bf0a7595 req-c89b9a24-95cc-416a-a068-a487d78771e6 service nova] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Received event network-vif-unplugged-be0c060b-e1fe-496e-8827-a2699e8a4017 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-646ac53d-8e0c-43c2-92a0-1fe0bf0a7595 req-c89b9a24-95cc-416a-a068-a487d78771e6 service nova] Acquiring lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-646ac53d-8e0c-43c2-92a0-1fe0bf0a7595 req-c89b9a24-95cc-416a-a068-a487d78771e6 service nova] Lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-646ac53d-8e0c-43c2-92a0-1fe0bf0a7595 req-c89b9a24-95cc-416a-a068-a487d78771e6 service nova] Lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG nova.compute.manager [req-646ac53d-8e0c-43c2-92a0-1fe0bf0a7595 req-c89b9a24-95cc-416a-a068-a487d78771e6 service nova] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] No waiting events found dispatching network-vif-unplugged-be0c060b-e1fe-496e-8827-a2699e8a4017 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG nova.compute.manager [req-646ac53d-8e0c-43c2-92a0-1fe0bf0a7595 req-c89b9a24-95cc-416a-a068-a487d78771e6 service nova] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Received event network-vif-unplugged-be0c060b-e1fe-496e-8827-a2699e8a4017 for instance with task_state deleting. 
{{(pid=71205) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Starting heal instance info cache {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Skipping network cache update for instance because it is being deleted. {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9841}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Skipping network cache update for instance because it is being deleted. {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9841}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "refresh_cache-ce19423d-a6ee-4506-9cd1-ec4803abdd86" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquired lock "refresh_cache-ce19423d-a6ee-4506-9cd1-ec4803abdd86" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Forcefully refreshing network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:12 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Instance destroyed successfully. 
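The "Running periodic task ComputeManager._heal_instance_info_cache" (and similar) DEBUG lines above are emitted by oslo.service's periodic-task runner. The sketch below shows how a manager class registers such tasks and how a service loop invokes them; the manager class, task names and spacings are placeholders under that assumption, not nova's ComputeManager.

    # Sketch of the oslo.service machinery behind the
    # 'Running periodic task ComputeManager._...' DEBUG lines.
    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF


    class ExampleManager(periodic_task.PeriodicTasks):
        def __init__(self):
            super(ExampleManager, self).__init__(CONF)

        @periodic_task.periodic_task(spacing=60)
        def _heal_info_cache(self, context):
            print('healing info cache')

        @periodic_task.periodic_task(spacing=120, run_immediately=True)
        def _poll_volume_usage(self, context):
            print('polling volume usage')


    if __name__ == '__main__':
        mgr = ExampleManager()
        # A real service calls this repeatedly (e.g. from a looping call);
        # each call runs whichever registered tasks are due and returns the
        # number of seconds until the next task is due.
        delay = mgr.run_periodic_tasks(context=None)
        print('next task due in ~%s seconds' % delay)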
Apr 24 00:13:12 user nova-compute[71205]: DEBUG nova.objects.instance [None req-b05f9b0c-b584-43c3-a485-fd3a450d227d tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lazy-loading 'resources' on Instance uuid 212a2ad6-77ab-4615-b6ca-f426a3e76ab5 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:13:12 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Instance destroyed successfully. Apr 24 00:13:12 user nova-compute[71205]: DEBUG nova.objects.instance [None req-48430cbe-4d1b-4507-a203-17fd2952518d tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Lazy-loading 'resources' on Instance uuid ffbf17ce-e3cb-4099-bea3-6887fef4476d {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-b05f9b0c-b584-43c3-a485-fd3a450d227d tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:11:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1695303135',display_name='tempest-AttachVolumeTestJSON-server-1695303135',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-1695303135',id=2,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLCP71ilDAgiarKoZWp2VpgncCzSb29zFpexe4Gow4OMeIBbuSeM19Qy9FpbyZ23mx7wcJNC4TUUIImLZa0Jkxw/4VzByhN1LXhR6rqRIHWomLMjZmJ53RbDWMdufdl+oQ==',key_name='tempest-keypair-1814203803',keypairs=,launch_index=0,launched_at=2023-04-24T00:11:31Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='1d063c2bdc884fb8b826b9fb6fd97405',ramdisk_id='',reservation_id='r-dagqmnao',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeTestJSON-1425553791',owner_user_name='tempest-AttachVolumeTestJSON-1425553791-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-24T00:11:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='471d341199f0431a95ae54651c4f0780',uuid=212a2ad6-77ab-4615-b6ca-f426a3e76ab5,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54", "address": "fa:16:3e:32:fe:fb", "network": {"id": "c29459b8-bcaa-41ae-94fa-d61344bae22f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2015616970-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d063c2bdc884fb8b826b9fb6fd97405", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap81bcd0fd-3b", "ovs_interfaceid": "81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-b05f9b0c-b584-43c3-a485-fd3a450d227d tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Converting VIF {"id": "81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54", "address": "fa:16:3e:32:fe:fb", "network": {"id": "c29459b8-bcaa-41ae-94fa-d61344bae22f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2015616970-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d063c2bdc884fb8b826b9fb6fd97405", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap81bcd0fd-3b", "ovs_interfaceid": "81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-b05f9b0c-b584-43c3-a485-fd3a450d227d tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:32:fe:fb,bridge_name='br-int',has_traffic_filtering=True,id=81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54,network=Network(c29459b8-bcaa-41ae-94fa-d61344bae22f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81bcd0fd-3b') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG os_vif [None req-b05f9b0c-b584-43c3-a485-fd3a450d227d tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:32:fe:fb,bridge_name='br-int',has_traffic_filtering=True,id=81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54,network=Network(c29459b8-bcaa-41ae-94fa-d61344bae22f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81bcd0fd-3b') {{(pid=71205) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81bcd0fd-3b, bridge=br-int, if_exists=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-48430cbe-4d1b-4507-a203-17fd2952518d tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:11:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-437284520',display_name='tempest-DeleteServersTestJSON-server-437284520',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-437284520',id=3,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-24T00:11:37Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='218aba2df07b4afaa999399d0981e6bf',ramdisk_id='',reservation_id='r-v6rykihc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-DeleteServersTestJSON-1102950130',owner_user_name='tempest-DeleteServersTestJSON-1102950130-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-24T00:11:38Z,user_data=None,user_id='9cf55386d2654a1e86af6882a2aed860',uuid=ffbf17ce-e3cb-4099-bea3-6887fef4476d,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "be0c060b-e1fe-496e-8827-a2699e8a4017", "address": "fa:16:3e:d3:61:2a", "network": {"id": "431c37bf-e6d7-4aa7-9081-a8c7bb5d5f3c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-800590822-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "218aba2df07b4afaa999399d0981e6bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe0c060b-e1", "ovs_interfaceid": "be0c060b-e1fe-496e-8827-a2699e8a4017", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-48430cbe-4d1b-4507-a203-17fd2952518d tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Converting VIF {"id": "be0c060b-e1fe-496e-8827-a2699e8a4017", "address": 
"fa:16:3e:d3:61:2a", "network": {"id": "431c37bf-e6d7-4aa7-9081-a8c7bb5d5f3c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-800590822-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "218aba2df07b4afaa999399d0981e6bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe0c060b-e1", "ovs_interfaceid": "be0c060b-e1fe-496e-8827-a2699e8a4017", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-48430cbe-4d1b-4507-a203-17fd2952518d tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:61:2a,bridge_name='br-int',has_traffic_filtering=True,id=be0c060b-e1fe-496e-8827-a2699e8a4017,network=Network(431c37bf-e6d7-4aa7-9081-a8c7bb5d5f3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe0c060b-e1') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG os_vif [None req-48430cbe-4d1b-4507-a203-17fd2952518d tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:61:2a,bridge_name='br-int',has_traffic_filtering=True,id=be0c060b-e1fe-496e-8827-a2699e8a4017,network=Network(431c37bf-e6d7-4aa7-9081-a8c7bb5d5f3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe0c060b-e1') {{(pid=71205) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbe0c060b-e1, bridge=br-int, if_exists=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:13:12 user nova-compute[71205]: INFO os_vif [None req-b05f9b0c-b584-43c3-a485-fd3a450d227d tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Successfully unplugged vif 
VIFOpenVSwitch(active=True,address=fa:16:3e:32:fe:fb,bridge_name='br-int',has_traffic_filtering=True,id=81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54,network=Network(c29459b8-bcaa-41ae-94fa-d61344bae22f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81bcd0fd-3b') Apr 24 00:13:12 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-b05f9b0c-b584-43c3-a485-fd3a450d227d tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Deleting instance files /opt/stack/data/nova/instances/212a2ad6-77ab-4615-b6ca-f426a3e76ab5_del Apr 24 00:13:12 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-b05f9b0c-b584-43c3-a485-fd3a450d227d tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Deletion of /opt/stack/data/nova/instances/212a2ad6-77ab-4615-b6ca-f426a3e76ab5_del complete Apr 24 00:13:12 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:13:12 user nova-compute[71205]: INFO os_vif [None req-48430cbe-4d1b-4507-a203-17fd2952518d tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:61:2a,bridge_name='br-int',has_traffic_filtering=True,id=be0c060b-e1fe-496e-8827-a2699e8a4017,network=Network(431c37bf-e6d7-4aa7-9081-a8c7bb5d5f3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe0c060b-e1') Apr 24 00:13:12 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-48430cbe-4d1b-4507-a203-17fd2952518d tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Deleting instance files /opt/stack/data/nova/instances/ffbf17ce-e3cb-4099-bea3-6887fef4476d_del Apr 24 00:13:12 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-48430cbe-4d1b-4507-a203-17fd2952518d tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Deletion of /opt/stack/data/nova/instances/ffbf17ce-e3cb-4099-bea3-6887fef4476d_del complete Apr 24 00:13:12 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:12 user nova-compute[71205]: INFO nova.compute.manager [None req-b05f9b0c-b584-43c3-a485-fd3a450d227d tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Took 1.30 seconds to destroy the instance on the hypervisor. Apr 24 00:13:12 user nova-compute[71205]: DEBUG oslo.service.loopingcall [None req-b05f9b0c-b584-43c3-a485-fd3a450d227d tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71205) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 24 00:13:12 user nova-compute[71205]: INFO nova.compute.manager [None req-48430cbe-4d1b-4507-a203-17fd2952518d tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Took 0.95 seconds to destroy the instance on the hypervisor. Apr 24 00:13:12 user nova-compute[71205]: DEBUG oslo.service.loopingcall [None req-48430cbe-4d1b-4507-a203-17fd2952518d tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71205) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG nova.compute.manager [-] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Deallocating network for instance {{(pid=71205) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] deallocate_for_instance() {{(pid=71205) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG nova.compute.manager [-] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Deallocating network for instance {{(pid=71205) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 24 00:13:12 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] deallocate_for_instance() {{(pid=71205) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 24 00:13:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:13 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Updating instance_info_cache with network_info: [{"id": "f6c98734-17ee-42d0-9372-fd76526b0b27", "address": "fa:16:3e:72:c7:21", "network": {"id": "2b9edd97-2af0-4b09-993c-95cc2b427fd8", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1649565193-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ce75f63fc0904eceb03e8319bddba4d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6c98734-17", "ovs_interfaceid": "f6c98734-17ee-42d0-9372-fd76526b0b27", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:13:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Releasing lock "refresh_cache-ce19423d-a6ee-4506-9cd1-ec4803abdd86" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:13:13 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Updated the network info_cache for instance {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 24 00:13:13 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:13:13 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:13:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:13 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:13:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e762c863-43e1-4f26-ab6b-c8ea40f08887/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:13 user nova-compute[71205]: DEBUG nova.compute.manager [req-3991a5ec-9bb3-4ba8-8466-58fcc5f19574 req-1ab1a88e-d85b-4fdf-a8cb-eafcc59e7439 service nova] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Received event network-vif-plugged-81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:13:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-3991a5ec-9bb3-4ba8-8466-58fcc5f19574 req-1ab1a88e-d85b-4fdf-a8cb-eafcc59e7439 service nova] Acquiring lock "212a2ad6-77ab-4615-b6ca-f426a3e76ab5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-3991a5ec-9bb3-4ba8-8466-58fcc5f19574 req-1ab1a88e-d85b-4fdf-a8cb-eafcc59e7439 service nova] Lock "212a2ad6-77ab-4615-b6ca-f426a3e76ab5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-3991a5ec-9bb3-4ba8-8466-58fcc5f19574 req-1ab1a88e-d85b-4fdf-a8cb-eafcc59e7439 service nova] Lock "212a2ad6-77ab-4615-b6ca-f426a3e76ab5-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:13 user nova-compute[71205]: DEBUG nova.compute.manager [req-3991a5ec-9bb3-4ba8-8466-58fcc5f19574 req-1ab1a88e-d85b-4fdf-a8cb-eafcc59e7439 service nova] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] No waiting events found dispatching network-vif-plugged-81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:13:13 user nova-compute[71205]: WARNING nova.compute.manager [req-3991a5ec-9bb3-4ba8-8466-58fcc5f19574 req-1ab1a88e-d85b-4fdf-a8cb-eafcc59e7439 service nova] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Received unexpected event network-vif-plugged-81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54 for instance with vm_state active and task_state deleting. Apr 24 00:13:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e762c863-43e1-4f26-ab6b-c8ea40f08887/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e762c863-43e1-4f26-ab6b-c8ea40f08887/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e762c863-43e1-4f26-ab6b-c8ea40f08887/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C 
LANG=C qemu-img info /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG nova.compute.manager [req-b055a66d-c1c8-4266-a077-59a386acc47f req-53897808-8bc3-4730-9cbf-ed2737c1cf0c service nova] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Received event network-vif-plugged-be0c060b-e1fe-496e-8827-a2699e8a4017 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-b055a66d-c1c8-4266-a077-59a386acc47f req-53897808-8bc3-4730-9cbf-ed2737c1cf0c service nova] Acquiring lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-b055a66d-c1c8-4266-a077-59a386acc47f req-53897808-8bc3-4730-9cbf-ed2737c1cf0c service nova] Lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-b055a66d-c1c8-4266-a077-59a386acc47f req-53897808-8bc3-4730-9cbf-ed2737c1cf0c service nova] Lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG nova.compute.manager [req-b055a66d-c1c8-4266-a077-59a386acc47f req-53897808-8bc3-4730-9cbf-ed2737c1cf0c service nova] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] No waiting events found dispatching network-vif-plugged-be0c060b-e1fe-496e-8827-a2699e8a4017 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:13:14 user nova-compute[71205]: WARNING nova.compute.manager [req-b055a66d-c1c8-4266-a077-59a386acc47f req-53897808-8bc3-4730-9cbf-ed2737c1cf0c service nova] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Received unexpected event network-vif-plugged-be0c060b-e1fe-496e-8827-a2699e8a4017 for instance with vm_state active and task_state deleting. 
Apr 24 00:13:14 user nova-compute[71205]: DEBUG nova.compute.manager [req-b055a66d-c1c8-4266-a077-59a386acc47f req-53897808-8bc3-4730-9cbf-ed2737c1cf0c service nova] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Received event network-vif-plugged-be0c060b-e1fe-496e-8827-a2699e8a4017 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-b055a66d-c1c8-4266-a077-59a386acc47f req-53897808-8bc3-4730-9cbf-ed2737c1cf0c service nova] Acquiring lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-b055a66d-c1c8-4266-a077-59a386acc47f req-53897808-8bc3-4730-9cbf-ed2737c1cf0c service nova] Lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-b055a66d-c1c8-4266-a077-59a386acc47f req-53897808-8bc3-4730-9cbf-ed2737c1cf0c service nova] Lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG nova.compute.manager [req-b055a66d-c1c8-4266-a077-59a386acc47f req-53897808-8bc3-4730-9cbf-ed2737c1cf0c service nova] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] No waiting events found dispatching network-vif-plugged-be0c060b-e1fe-496e-8827-a2699e8a4017 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:13:14 user nova-compute[71205]: WARNING nova.compute.manager [req-b055a66d-c1c8-4266-a077-59a386acc47f req-53897808-8bc3-4730-9cbf-ed2737c1cf0c service nova] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Received unexpected event network-vif-plugged-be0c060b-e1fe-496e-8827-a2699e8a4017 for instance with vm_state active and task_state deleting. 
Apr 24 00:13:14 user nova-compute[71205]: DEBUG nova.compute.manager [req-b055a66d-c1c8-4266-a077-59a386acc47f req-53897808-8bc3-4730-9cbf-ed2737c1cf0c service nova] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Received event network-vif-plugged-be0c060b-e1fe-496e-8827-a2699e8a4017 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-b055a66d-c1c8-4266-a077-59a386acc47f req-53897808-8bc3-4730-9cbf-ed2737c1cf0c service nova] Acquiring lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-b055a66d-c1c8-4266-a077-59a386acc47f req-53897808-8bc3-4730-9cbf-ed2737c1cf0c service nova] Lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-b055a66d-c1c8-4266-a077-59a386acc47f req-53897808-8bc3-4730-9cbf-ed2737c1cf0c service nova] Lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG nova.compute.manager [req-b055a66d-c1c8-4266-a077-59a386acc47f req-53897808-8bc3-4730-9cbf-ed2737c1cf0c service nova] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] No waiting events found dispatching network-vif-plugged-be0c060b-e1fe-496e-8827-a2699e8a4017 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:13:14 user nova-compute[71205]: WARNING nova.compute.manager [req-b055a66d-c1c8-4266-a077-59a386acc47f req-53897808-8bc3-4730-9cbf-ed2737c1cf0c service nova] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Received unexpected event network-vif-plugged-be0c060b-e1fe-496e-8827-a2699e8a4017 for instance with vm_state active and task_state deleting. Apr 24 00:13:14 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:13:14 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Took 1.47 seconds to deallocate network for instance. 
Apr 24 00:13:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk --force-share --output=json" returned: 0 in 0.150s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-48430cbe-4d1b-4507-a203-17fd2952518d tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-48430cbe-4d1b-4507-a203-17fd2952518d tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:14 user nova-compute[71205]: INFO 
nova.compute.manager [-] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Took 1.84 seconds to deallocate network for instance. Apr 24 00:13:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e7bfc8d-7d4a-42f7-9657-cc65e1364b87/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-48430cbe-4d1b-4507-a203-17fd2952518d tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG nova.compute.manager [req-59c8e156-4055-40f6-aa9d-147571cff109 req-6ab44ad4-036b-4560-b5d8-380ac8d1a7e6 service nova] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Received event network-vif-deleted-81bcd0fd-3b1f-4890-b1fb-3086a1ac8a54 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-b05f9b0c-b584-43c3-a485-fd3a450d227d tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-48430cbe-4d1b-4507-a203-17fd2952518d tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-48430cbe-4d1b-4507-a203-17fd2952518d tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.427s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-b05f9b0c-b584-43c3-a485-fd3a450d227d tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.054s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD 
"/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e7bfc8d-7d4a-42f7-9657-cc65e1364b87/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e7bfc8d-7d4a-42f7-9657-cc65e1364b87/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:14 user nova-compute[71205]: INFO nova.scheduler.client.report [None req-48430cbe-4d1b-4507-a203-17fd2952518d tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Deleted allocations for instance ffbf17ce-e3cb-4099-bea3-6887fef4476d Apr 24 00:13:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Acquiring lock "1821ecf2-8c71-48ad-96da-f63b83439c6d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "1821ecf2-8c71-48ad-96da-f63b83439c6d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:15 user nova-compute[71205]: DEBUG nova.compute.manager [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Starting instance... 
{{(pid=71205) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 24 00:13:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e7bfc8d-7d4a-42f7-9657-cc65e1364b87/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:15 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-48430cbe-4d1b-4507-a203-17fd2952518d tempest-DeleteServersTestJSON-1102950130 tempest-DeleteServersTestJSON-1102950130-project-member] Lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 3.118s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:15 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:15 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-b05f9b0c-b584-43c3-a485-fd3a450d227d tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:13:15 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-b05f9b0c-b584-43c3-a485-fd3a450d227d tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:13:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk --force-share --output=json" returned: 0 in 0.179s {{(pid=71205) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:15 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-b05f9b0c-b584-43c3-a485-fd3a450d227d tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.414s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:15 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.143s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:15 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71205) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 24 00:13:15 user nova-compute[71205]: INFO nova.compute.claims [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Claim successful on node user Apr 24 00:13:15 user nova-compute[71205]: INFO nova.scheduler.client.report [None req-b05f9b0c-b584-43c3-a485-fd3a450d227d tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Deleted allocations for instance 212a2ad6-77ab-4615-b6ca-f426a3e76ab5 Apr 24 00:13:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:15 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-b05f9b0c-b584-43c3-a485-fd3a450d227d tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "212a2ad6-77ab-4615-b6ca-f426a3e76ab5" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 3.808s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env 
LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:15 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:13:15 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.838s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.compute.manager [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Start building networks 
asynchronously for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 24 00:13:16 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:13:16 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Hypervisor/Node resource view: name=user free_ram=8217MB free_disk=26.625202178955078GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.compute.manager [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Allocating IP information in the background. {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.network.neutron [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] allocate_for_instance() {{(pid=71205) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 24 00:13:16 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.compute.manager [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Start building block device mappings for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance ce19423d-a6ee-4506-9cd1-ec4803abdd86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance e762c863-43e1-4f26-ab6b-c8ea40f08887 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance ac38bbc2-2229-4497-b501-e9230ec59a32 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance f1a14b79-7792-4962-bbe1-ec11e10e6948 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance 1821ecf2-8c71-48ad-96da-f63b83439c6d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Total usable vcpus: 12, total allocated vcpus: 8 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Final resource view: name=user phys_ram=16023MB used_ram=1536MB phys_disk=40GB used_disk=8GB total_vcpus=12 used_vcpus=8 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.policy [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'abae98323deb44dea0622186485cc7af', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ce75f63fc0904eceb03e8319bddba4d3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71205) authorize /opt/stack/nova/nova/policy.py:203}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.compute.manager [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Start spawning the instance on the hypervisor. 
{{(pid=71205) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Creating instance directory {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 24 00:13:16 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Creating image(s) Apr 24 00:13:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Acquiring lock "/opt/stack/data/nova/instances/1821ecf2-8c71-48ad-96da-f63b83439c6d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "/opt/stack/data/nova/instances/1821ecf2-8c71-48ad-96da-f63b83439c6d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "/opt/stack/data/nova/instances/1821ecf2-8c71-48ad-96da-f63b83439c6d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.compute.manager [req-cd9ff962-30c4-4e80-be31-a2f99d72dc01 req-85687f0d-f09b-4ad2-acc5-404410f5e1f8 service nova] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Received event network-vif-plugged-be0c060b-e1fe-496e-8827-a2699e8a4017 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-cd9ff962-30c4-4e80-be31-a2f99d72dc01 req-85687f0d-f09b-4ad2-acc5-404410f5e1f8 service nova] Acquiring lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-cd9ff962-30c4-4e80-be31-a2f99d72dc01 req-85687f0d-f09b-4ad2-acc5-404410f5e1f8 service nova] Lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-cd9ff962-30c4-4e80-be31-a2f99d72dc01 req-85687f0d-f09b-4ad2-acc5-404410f5e1f8 service nova] Lock "ffbf17ce-e3cb-4099-bea3-6887fef4476d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.compute.manager [req-cd9ff962-30c4-4e80-be31-a2f99d72dc01 req-85687f0d-f09b-4ad2-acc5-404410f5e1f8 service nova] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] No waiting events found dispatching network-vif-plugged-be0c060b-e1fe-496e-8827-a2699e8a4017 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:13:16 user nova-compute[71205]: WARNING nova.compute.manager [req-cd9ff962-30c4-4e80-be31-a2f99d72dc01 req-85687f0d-f09b-4ad2-acc5-404410f5e1f8 service nova] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Received unexpected event network-vif-plugged-be0c060b-e1fe-496e-8827-a2699e8a4017 for instance with vm_state deleted and task_state None. Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.compute.manager [req-cd9ff962-30c4-4e80-be31-a2f99d72dc01 req-85687f0d-f09b-4ad2-acc5-404410f5e1f8 service nova] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Received event network-vif-deleted-be0c060b-e1fe-496e-8827-a2699e8a4017 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.compute.manager [req-cd9ff962-30c4-4e80-be31-a2f99d72dc01 req-85687f0d-f09b-4ad2-acc5-404410f5e1f8 service nova] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Received event network-changed-d90cab57-5c57-410a-a61e-ab316454a676 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.compute.manager [req-cd9ff962-30c4-4e80-be31-a2f99d72dc01 req-85687f0d-f09b-4ad2-acc5-404410f5e1f8 service nova] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Refreshing instance network info cache due to event network-changed-d90cab57-5c57-410a-a61e-ab316454a676. 
{{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-cd9ff962-30c4-4e80-be31-a2f99d72dc01 req-85687f0d-f09b-4ad2-acc5-404410f5e1f8 service nova] Acquiring lock "refresh_cache-e762c863-43e1-4f26-ab6b-c8ea40f08887" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-cd9ff962-30c4-4e80-be31-a2f99d72dc01 req-85687f0d-f09b-4ad2-acc5-404410f5e1f8 service nova] Acquired lock "refresh_cache-e762c863-43e1-4f26-ab6b-c8ea40f08887" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.network.neutron [req-cd9ff962-30c4-4e80-be31-a2f99d72dc01 req-85687f0d-f09b-4ad2-acc5-404410f5e1f8 service nova] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Refreshing network info cache for port d90cab57-5c57-410a-a61e-ab316454a676 {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.140s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Acquiring lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG 
nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.479s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.142s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/1821ecf2-8c71-48ad-96da-f63b83439c6d/disk 1073741824 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/1821ecf2-8c71-48ad-96da-f63b83439c6d/disk 1073741824" returned: 0 in 0.045s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.192s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG 
oslo_concurrency.processutils [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.141s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Checking if we can resize image /opt/stack/data/nova/instances/1821ecf2-8c71-48ad-96da-f63b83439c6d/disk. size=1073741824 {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1821ecf2-8c71-48ad-96da-f63b83439c6d/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.network.neutron [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Successfully created port: 86355003-a71c-4c4f-9536-beab8f09ded2 {{(pid=71205) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1821ecf2-8c71-48ad-96da-f63b83439c6d/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Cannot resize image /opt/stack/data/nova/instances/1821ecf2-8c71-48ad-96da-f63b83439c6d/disk to a smaller size. 
{{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.objects.instance [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lazy-loading 'migration_context' on Instance uuid 1821ecf2-8c71-48ad-96da-f63b83439c6d {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Created local disks {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Ensure instance console log exists: /opt/stack/data/nova/instances/1821ecf2-8c71-48ad-96da-f63b83439c6d/console.log {{(pid=71205) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:17 user nova-compute[71205]: DEBUG nova.network.neutron [req-cd9ff962-30c4-4e80-be31-a2f99d72dc01 req-85687f0d-f09b-4ad2-acc5-404410f5e1f8 service nova] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Updated VIF entry in instance network info cache for port d90cab57-5c57-410a-a61e-ab316454a676. 
{{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:13:17 user nova-compute[71205]: DEBUG nova.network.neutron [req-cd9ff962-30c4-4e80-be31-a2f99d72dc01 req-85687f0d-f09b-4ad2-acc5-404410f5e1f8 service nova] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Updating instance_info_cache with network_info: [{"id": "d90cab57-5c57-410a-a61e-ab316454a676", "address": "fa:16:3e:07:f9:7b", "network": {"id": "e09e353d-220c-429c-9767-a11a59bbf99a", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2107564613-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e2bf3154181247f8963be8cd31399851", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd90cab57-5c", "ovs_interfaceid": "d90cab57-5c57-410a-a61e-ab316454a676", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:13:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-cd9ff962-30c4-4e80-be31-a2f99d72dc01 req-85687f0d-f09b-4ad2-acc5-404410f5e1f8 service nova] Releasing lock "refresh_cache-e762c863-43e1-4f26-ab6b-c8ea40f08887" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:13:17 user nova-compute[71205]: DEBUG nova.network.neutron [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Successfully updated port: 86355003-a71c-4c4f-9536-beab8f09ded2 {{(pid=71205) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 24 00:13:17 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:13:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Acquiring lock "refresh_cache-1821ecf2-8c71-48ad-96da-f63b83439c6d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:13:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Acquired lock "refresh_cache-1821ecf2-8c71-48ad-96da-f63b83439c6d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:13:17 user nova-compute[71205]: DEBUG nova.network.neutron [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] 
[instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Building network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 24 00:13:17 user nova-compute[71205]: DEBUG nova.compute.manager [req-58118f99-ec29-4964-a953-34b866c5f9fb req-4af872d0-a0ab-49b7-9481-4e536594d4e2 service nova] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Received event network-changed-86355003-a71c-4c4f-9536-beab8f09ded2 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:13:17 user nova-compute[71205]: DEBUG nova.compute.manager [req-58118f99-ec29-4964-a953-34b866c5f9fb req-4af872d0-a0ab-49b7-9481-4e536594d4e2 service nova] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Refreshing instance network info cache due to event network-changed-86355003-a71c-4c4f-9536-beab8f09ded2. {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:13:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-58118f99-ec29-4964-a953-34b866c5f9fb req-4af872d0-a0ab-49b7-9481-4e536594d4e2 service nova] Acquiring lock "refresh_cache-1821ecf2-8c71-48ad-96da-f63b83439c6d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:13:17 user nova-compute[71205]: DEBUG nova.network.neutron [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Instance cache missing network info. {{(pid=71205) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 24 00:13:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-db748f70-29ef-4bf2-9d83-549f8e74e627 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Acquiring lock "e762c863-43e1-4f26-ab6b-c8ea40f08887" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-db748f70-29ef-4bf2-9d83-549f8e74e627 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "e762c863-43e1-4f26-ab6b-c8ea40f08887" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-db748f70-29ef-4bf2-9d83-549f8e74e627 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Acquiring lock "e762c863-43e1-4f26-ab6b-c8ea40f08887-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-db748f70-29ef-4bf2-9d83-549f8e74e627 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "e762c863-43e1-4f26-ab6b-c8ea40f08887-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71205) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-db748f70-29ef-4bf2-9d83-549f8e74e627 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "e762c863-43e1-4f26-ab6b-c8ea40f08887-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:17 user nova-compute[71205]: INFO nova.compute.manager [None req-db748f70-29ef-4bf2-9d83-549f8e74e627 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Terminating instance Apr 24 00:13:17 user nova-compute[71205]: DEBUG nova.compute.manager [None req-db748f70-29ef-4bf2-9d83-549f8e74e627 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Start destroying the instance on the hypervisor. {{(pid=71205) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 24 00:13:17 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:17 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:17 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:17 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:17 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.network.neutron [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Updating instance_info_cache with network_info: [{"id": "86355003-a71c-4c4f-9536-beab8f09ded2", "address": "fa:16:3e:fb:43:8b", "network": {"id": "2b9edd97-2af0-4b09-993c-95cc2b427fd8", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1649565193-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ce75f63fc0904eceb03e8319bddba4d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap86355003-a7", "ovs_interfaceid": "86355003-a71c-4c4f-9536-beab8f09ded2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info 
/opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Releasing lock "refresh_cache-1821ecf2-8c71-48ad-96da-f63b83439c6d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.compute.manager [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Instance network_info: |[{"id": "86355003-a71c-4c4f-9536-beab8f09ded2", "address": "fa:16:3e:fb:43:8b", "network": {"id": "2b9edd97-2af0-4b09-993c-95cc2b427fd8", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1649565193-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ce75f63fc0904eceb03e8319bddba4d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap86355003-a7", "ovs_interfaceid": "86355003-a71c-4c4f-9536-beab8f09ded2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-58118f99-ec29-4964-a953-34b866c5f9fb req-4af872d0-a0ab-49b7-9481-4e536594d4e2 service nova] Acquired lock "refresh_cache-1821ecf2-8c71-48ad-96da-f63b83439c6d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.network.neutron [req-58118f99-ec29-4964-a953-34b866c5f9fb req-4af872d0-a0ab-49b7-9481-4e536594d4e2 service nova] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Refreshing network info cache for port 86355003-a71c-4c4f-9536-beab8f09ded2 {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Start _get_guest_xml network_info=[{"id": "86355003-a71c-4c4f-9536-beab8f09ded2", "address": "fa:16:3e:fb:43:8b", "network": {"id": "2b9edd97-2af0-4b09-993c-95cc2b427fd8", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1649565193-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ce75f63fc0904eceb03e8319bddba4d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": 
{"0": "ovn"}}, "devname": "tap86355003-a7", "ovs_interfaceid": "86355003-a71c-4c4f-9536-beab8f09ded2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': 'fcf09ead-c5af-40cc-b5cf-92626e181ef9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 24 00:13:18 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:13:18 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71205) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-24T00:08:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=), allow threads: True {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Flavor limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Image limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Flavor pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Image pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71205) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Got 1 possible topologies {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:13:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1213468308',display_name='tempest-ServersNegativeTestJSON-server-1213468308',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1213468308',id=11,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ce75f63fc0904eceb03e8319bddba4d3',ramdisk_id='',reservation_id='r-o0umdweg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-380105770',owner_user_name='tempest-ServersNegativeTestJSON-380105770-project-member'},tags=TagList,task_stat
e='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:13:16Z,user_data=None,user_id='abae98323deb44dea0622186485cc7af',uuid=1821ecf2-8c71-48ad-96da-f63b83439c6d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "86355003-a71c-4c4f-9536-beab8f09ded2", "address": "fa:16:3e:fb:43:8b", "network": {"id": "2b9edd97-2af0-4b09-993c-95cc2b427fd8", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1649565193-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ce75f63fc0904eceb03e8319bddba4d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap86355003-a7", "ovs_interfaceid": "86355003-a71c-4c4f-9536-beab8f09ded2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71205) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Converting VIF {"id": "86355003-a71c-4c4f-9536-beab8f09ded2", "address": "fa:16:3e:fb:43:8b", "network": {"id": "2b9edd97-2af0-4b09-993c-95cc2b427fd8", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1649565193-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ce75f63fc0904eceb03e8319bddba4d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap86355003-a7", "ovs_interfaceid": "86355003-a71c-4c4f-9536-beab8f09ded2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fb:43:8b,bridge_name='br-int',has_traffic_filtering=True,id=86355003-a71c-4c4f-9536-beab8f09ded2,network=Network(2b9edd97-2af0-4b09-993c-95cc2b427fd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86355003-a7') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.objects.instance [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lazy-loading 'pci_devices' on Instance uuid 
1821ecf2-8c71-48ad-96da-f63b83439c6d {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] End _get_guest_xml xml= [guest domain XML not reproduced: the XML markup was stripped when this log was captured, leaving only bare element text spread across the journal lines; the recoverable values are uuid 1821ecf2-8c71-48ad-96da-f63b83439c6d, domain name instance-0000000b, 131072 KiB memory, 1 vCPU, the Nova metadata block (server name tempest-ServersNegativeTestJSON-server-1213468308, creation time 2023-04-24 00:13:18, flavor 128 MB RAM / 1 GB root / 0 swap / 0 ephemeral / 1 vCPU, owner tempest-ServersNegativeTestJSON-380105770-project-member in project tempest-ServersNegativeTestJSON-380105770), SMBIOS sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0 / Virtual Machine, os type hvm, CPU model Nehalem, and an RNG device backed by /dev/urandom] {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}}
Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:13:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1213468308',display_name='tempest-ServersNegativeTestJSON-server-1213468308',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1213468308',id=11,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ce75f63fc0904eceb03e8319bddba4d3',ramdisk_id='',reservation_id='r-o0umdweg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-380105770',owner_user_name='tempest-ServersNegativeTestJSON-380105770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:13:16Z,user_data=None,user_id='abae98323deb44dea0622186485cc7af',uuid=1821ecf2-8c71-48ad-96da-f63b83439c6d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "86355003-a71c-4c4f-9536-beab8f09ded2", "address": "fa:16:3e:fb:43:8b", "network": {"id": "2b9edd97-2af0-4b09-993c-95cc2b427fd8", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1649565193-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type":
"gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ce75f63fc0904eceb03e8319bddba4d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap86355003-a7", "ovs_interfaceid": "86355003-a71c-4c4f-9536-beab8f09ded2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Converting VIF {"id": "86355003-a71c-4c4f-9536-beab8f09ded2", "address": "fa:16:3e:fb:43:8b", "network": {"id": "2b9edd97-2af0-4b09-993c-95cc2b427fd8", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1649565193-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ce75f63fc0904eceb03e8319bddba4d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap86355003-a7", "ovs_interfaceid": "86355003-a71c-4c4f-9536-beab8f09ded2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fb:43:8b,bridge_name='br-int',has_traffic_filtering=True,id=86355003-a71c-4c4f-9536-beab8f09ded2,network=Network(2b9edd97-2af0-4b09-993c-95cc2b427fd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86355003-a7') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG os_vif [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fb:43:8b,bridge_name='br-int',has_traffic_filtering=True,id=86355003-a71c-4c4f-9536-beab8f09ded2,network=Network(2b9edd97-2af0-4b09-993c-95cc2b427fd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86355003-a7') {{(pid=71205) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap86355003-a7, may_exist=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap86355003-a7, col_values=(('external_ids', {'iface-id': '86355003-a71c-4c4f-9536-beab8f09ded2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fb:43:8b', 'vm-uuid': '1821ecf2-8c71-48ad-96da-f63b83439c6d'}),)) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:18 user nova-compute[71205]: INFO os_vif [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fb:43:8b,bridge_name='br-int',has_traffic_filtering=True,id=86355003-a71c-4c4f-9536-beab8f09ded2,network=Network(2b9edd97-2af0-4b09-993c-95cc2b427fd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86355003-a7') Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71205) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] No VIF found with MAC fa:16:3e:fb:43:8b, not building metadata {{(pid=71205) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.compute.manager [req-3724c3cf-1abb-43b8-99d2-7f7b63f0bd3b req-7597c7a8-af68-42f9-ae1d-871a58539b26 service nova] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Received event network-vif-unplugged-d90cab57-5c57-410a-a61e-ab316454a676 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-3724c3cf-1abb-43b8-99d2-7f7b63f0bd3b req-7597c7a8-af68-42f9-ae1d-871a58539b26 service nova] Acquiring lock "e762c863-43e1-4f26-ab6b-c8ea40f08887-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-3724c3cf-1abb-43b8-99d2-7f7b63f0bd3b req-7597c7a8-af68-42f9-ae1d-871a58539b26 service nova] Lock "e762c863-43e1-4f26-ab6b-c8ea40f08887-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-3724c3cf-1abb-43b8-99d2-7f7b63f0bd3b req-7597c7a8-af68-42f9-ae1d-871a58539b26 service nova] Lock "e762c863-43e1-4f26-ab6b-c8ea40f08887-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.compute.manager [req-3724c3cf-1abb-43b8-99d2-7f7b63f0bd3b req-7597c7a8-af68-42f9-ae1d-871a58539b26 service nova] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] No waiting events found dispatching network-vif-unplugged-d90cab57-5c57-410a-a61e-ab316454a676 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.compute.manager [req-3724c3cf-1abb-43b8-99d2-7f7b63f0bd3b req-7597c7a8-af68-42f9-ae1d-871a58539b26 service nova] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Received event network-vif-unplugged-d90cab57-5c57-410a-a61e-ab316454a676 for instance with task_state deleting. 
{{(pid=71205) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.compute.manager [req-3724c3cf-1abb-43b8-99d2-7f7b63f0bd3b req-7597c7a8-af68-42f9-ae1d-871a58539b26 service nova] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Received event network-vif-plugged-d90cab57-5c57-410a-a61e-ab316454a676 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-3724c3cf-1abb-43b8-99d2-7f7b63f0bd3b req-7597c7a8-af68-42f9-ae1d-871a58539b26 service nova] Acquiring lock "e762c863-43e1-4f26-ab6b-c8ea40f08887-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-3724c3cf-1abb-43b8-99d2-7f7b63f0bd3b req-7597c7a8-af68-42f9-ae1d-871a58539b26 service nova] Lock "e762c863-43e1-4f26-ab6b-c8ea40f08887-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-3724c3cf-1abb-43b8-99d2-7f7b63f0bd3b req-7597c7a8-af68-42f9-ae1d-871a58539b26 service nova] Lock "e762c863-43e1-4f26-ab6b-c8ea40f08887-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.compute.manager [req-3724c3cf-1abb-43b8-99d2-7f7b63f0bd3b req-7597c7a8-af68-42f9-ae1d-871a58539b26 service nova] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] No waiting events found dispatching network-vif-plugged-d90cab57-5c57-410a-a61e-ab316454a676 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:13:18 user nova-compute[71205]: WARNING nova.compute.manager [req-3724c3cf-1abb-43b8-99d2-7f7b63f0bd3b req-7597c7a8-af68-42f9-ae1d-871a58539b26 service nova] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Received unexpected event network-vif-plugged-d90cab57-5c57-410a-a61e-ab316454a676 for instance with vm_state active and task_state deleting. Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.network.neutron [req-58118f99-ec29-4964-a953-34b866c5f9fb req-4af872d0-a0ab-49b7-9481-4e536594d4e2 service nova] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Updated VIF entry in instance network info cache for port 86355003-a71c-4c4f-9536-beab8f09ded2. 
{{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.network.neutron [req-58118f99-ec29-4964-a953-34b866c5f9fb req-4af872d0-a0ab-49b7-9481-4e536594d4e2 service nova] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Updating instance_info_cache with network_info: [{"id": "86355003-a71c-4c4f-9536-beab8f09ded2", "address": "fa:16:3e:fb:43:8b", "network": {"id": "2b9edd97-2af0-4b09-993c-95cc2b427fd8", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1649565193-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ce75f63fc0904eceb03e8319bddba4d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap86355003-a7", "ovs_interfaceid": "86355003-a71c-4c4f-9536-beab8f09ded2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-58118f99-ec29-4964-a953-34b866c5f9fb req-4af872d0-a0ab-49b7-9481-4e536594d4e2 service nova] Releasing lock "refresh_cache-1821ecf2-8c71-48ad-96da-f63b83439c6d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:13:18 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Instance destroyed successfully. 
Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.objects.instance [None req-db748f70-29ef-4bf2-9d83-549f8e74e627 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lazy-loading 'resources' on Instance uuid e762c863-43e1-4f26-ab6b-c8ea40f08887 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-db748f70-29ef-4bf2-9d83-549f8e74e627 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:11:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-952005257',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-952005257',id=5,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH/KzlzjtqauDBrnzunF6cUjlci7dQJPS3JVA4rtGDgOFyp/2HY51uowvViM2HhE4lJZuUeHsvbKt1HzFUPo9qtzdZ2uB0Xr7EHNKxzFBZkrBwvigyr3VT3tKQN0UGAShA==',key_name='tempest-keypair-573827894',keypairs=,launch_index=0,launched_at=2023-04-24T00:11:37Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='e2bf3154181247f8963be8cd31399851',ramdisk_id='',reservation_id='r-oy4uh9un',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeShelveTestJSON-1947115496',owner_user_name='tempest-AttachVolumeShelveTestJSON-1947115496-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-24T00:11:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='539997e65f4f4ef7998a4386d19a5e9f',uuid=e762c863-43e1-4f26-ab6b-c8ea40f08887,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d90cab57-5c57-410a-a61e-ab316454a676", "address": "fa:16:3e:07:f9:7b", "network": {"id": "e09e353d-220c-429c-9767-a11a59bbf99a", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2107564613-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "172.24.4.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e2bf3154181247f8963be8cd31399851", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd90cab57-5c", "ovs_interfaceid": "d90cab57-5c57-410a-a61e-ab316454a676", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-db748f70-29ef-4bf2-9d83-549f8e74e627 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Converting VIF {"id": "d90cab57-5c57-410a-a61e-ab316454a676", "address": "fa:16:3e:07:f9:7b", "network": {"id": "e09e353d-220c-429c-9767-a11a59bbf99a", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2107564613-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e2bf3154181247f8963be8cd31399851", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd90cab57-5c", "ovs_interfaceid": "d90cab57-5c57-410a-a61e-ab316454a676", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-db748f70-29ef-4bf2-9d83-549f8e74e627 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:07:f9:7b,bridge_name='br-int',has_traffic_filtering=True,id=d90cab57-5c57-410a-a61e-ab316454a676,network=Network(e09e353d-220c-429c-9767-a11a59bbf99a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd90cab57-5c') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG os_vif [None req-db748f70-29ef-4bf2-9d83-549f8e74e627 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:07:f9:7b,bridge_name='br-int',has_traffic_filtering=True,id=d90cab57-5c57-410a-a61e-ab316454a676,network=Network(e09e353d-220c-429c-9767-a11a59bbf99a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd90cab57-5c') {{(pid=71205) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} 
Apr 24 00:13:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd90cab57-5c, bridge=br-int, if_exists=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:18 user nova-compute[71205]: INFO os_vif [None req-db748f70-29ef-4bf2-9d83-549f8e74e627 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:07:f9:7b,bridge_name='br-int',has_traffic_filtering=True,id=d90cab57-5c57-410a-a61e-ab316454a676,network=Network(e09e353d-220c-429c-9767-a11a59bbf99a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd90cab57-5c') Apr 24 00:13:18 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-db748f70-29ef-4bf2-9d83-549f8e74e627 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Deleting instance files /opt/stack/data/nova/instances/e762c863-43e1-4f26-ab6b-c8ea40f08887_del Apr 24 00:13:18 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-db748f70-29ef-4bf2-9d83-549f8e74e627 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Deletion of /opt/stack/data/nova/instances/e762c863-43e1-4f26-ab6b-c8ea40f08887_del complete Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.compute.manager [None req-66214f98-5ae2-4a17-870b-ecf3c18cfc93 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:13:18 user nova-compute[71205]: INFO nova.compute.manager [None req-66214f98-5ae2-4a17-870b-ecf3c18cfc93 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] instance snapshotting Apr 24 00:13:18 user nova-compute[71205]: INFO nova.compute.manager [None req-db748f70-29ef-4bf2-9d83-549f8e74e627 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Took 1.15 seconds to destroy the instance on the hypervisor. Apr 24 00:13:18 user nova-compute[71205]: DEBUG oslo.service.loopingcall [None req-db748f70-29ef-4bf2-9d83-549f8e74e627 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71205) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.compute.manager [-] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Deallocating network for instance {{(pid=71205) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 24 00:13:18 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] deallocate_for_instance() {{(pid=71205) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 24 00:13:18 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-66214f98-5ae2-4a17-870b-ecf3c18cfc93 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Beginning live snapshot process Apr 24 00:13:19 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-66214f98-5ae2-4a17-870b-ecf3c18cfc93 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json -f qcow2 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:19 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-66214f98-5ae2-4a17-870b-ecf3c18cfc93 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json -f qcow2" returned: 0 in 0.133s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:19 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-66214f98-5ae2-4a17-870b-ecf3c18cfc93 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json -f qcow2 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:19 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-66214f98-5ae2-4a17-870b-ecf3c18cfc93 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json -f qcow2" returned: 0 in 0.128s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:19 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-66214f98-5ae2-4a17-870b-ecf3c18cfc93 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] 
Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:19 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-66214f98-5ae2-4a17-870b-ecf3c18cfc93 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.130s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:19 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-66214f98-5ae2-4a17-870b-ecf3c18cfc93 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpiiibiuxp/6234224455ab425db62aed8bf2fde566.delta 1073741824 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:19 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-66214f98-5ae2-4a17-870b-ecf3c18cfc93 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpiiibiuxp/6234224455ab425db62aed8bf2fde566.delta 1073741824" returned: 0 in 0.051s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:19 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-66214f98-5ae2-4a17-870b-ecf3c18cfc93 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Quiescing instance not available: QEMU guest agent is not enabled. Apr 24 00:13:19 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:13:19 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Took 0.70 seconds to deallocate network for instance. 
Apr 24 00:13:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-db748f70-29ef-4bf2-9d83-549f8e74e627 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-db748f70-29ef-4bf2-9d83-549f8e74e627 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:19 user nova-compute[71205]: DEBUG nova.compute.manager [req-06381364-4611-480d-8a8f-7f3dc95d4c5e req-1fff58b6-2aeb-45e2-b1c8-810d260c526e service nova] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Received event network-vif-deleted-d90cab57-5c57-410a-a61e-ab316454a676 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:13:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:19 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-db748f70-29ef-4bf2-9d83-549f8e74e627 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:13:19 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-db748f70-29ef-4bf2-9d83-549f8e74e627 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:13:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-db748f70-29ef-4bf2-9d83-549f8e74e627 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.291s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:19 user nova-compute[71205]: INFO nova.scheduler.client.report [None req-db748f70-29ef-4bf2-9d83-549f8e74e627 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Deleted allocations for instance e762c863-43e1-4f26-ab6b-c8ea40f08887 Apr 24 00:13:19 user nova-compute[71205]: DEBUG nova.compute.manager [req-97884ac8-3841-4506-9d5b-0ce507d769b6 req-314f97d1-a3d8-4ee0-87b2-3310497ac486 service nova] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Received event network-vif-plugged-86355003-a71c-4c4f-9536-beab8f09ded2 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:13:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-97884ac8-3841-4506-9d5b-0ce507d769b6 req-314f97d1-a3d8-4ee0-87b2-3310497ac486 service nova] Acquiring lock "1821ecf2-8c71-48ad-96da-f63b83439c6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-97884ac8-3841-4506-9d5b-0ce507d769b6 req-314f97d1-a3d8-4ee0-87b2-3310497ac486 service nova] Lock "1821ecf2-8c71-48ad-96da-f63b83439c6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-97884ac8-3841-4506-9d5b-0ce507d769b6 req-314f97d1-a3d8-4ee0-87b2-3310497ac486 service nova] Lock "1821ecf2-8c71-48ad-96da-f63b83439c6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:19 user nova-compute[71205]: DEBUG nova.compute.manager [req-97884ac8-3841-4506-9d5b-0ce507d769b6 req-314f97d1-a3d8-4ee0-87b2-3310497ac486 service nova] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] No waiting events found dispatching network-vif-plugged-86355003-a71c-4c4f-9536-beab8f09ded2 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:13:19 user nova-compute[71205]: WARNING nova.compute.manager [req-97884ac8-3841-4506-9d5b-0ce507d769b6 req-314f97d1-a3d8-4ee0-87b2-3310497ac486 service nova] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Received unexpected event network-vif-plugged-86355003-a71c-4c4f-9536-beab8f09ded2 for instance with vm_state building and task_state spawning. 
Apr 24 00:13:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-db748f70-29ef-4bf2-9d83-549f8e74e627 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "e762c863-43e1-4f26-ab6b-c8ea40f08887" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.314s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:20 user nova-compute[71205]: DEBUG nova.virt.libvirt.guest [None req-66214f98-5ae2-4a17-870b-ecf3c18cfc93 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] COPY block job progress, current cursor: 0 final cursor: 43778048 {{(pid=71205) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 24 00:13:20 user nova-compute[71205]: DEBUG nova.virt.libvirt.guest [None req-66214f98-5ae2-4a17-870b-ecf3c18cfc93 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] COPY block job progress, current cursor: 43778048 final cursor: 43778048 {{(pid=71205) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 24 00:13:20 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-66214f98-5ae2-4a17-870b-ecf3c18cfc93 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Skipping quiescing instance: QEMU guest agent is not enabled. Apr 24 00:13:20 user nova-compute[71205]: DEBUG nova.privsep.utils [None req-66214f98-5ae2-4a17-870b-ecf3c18cfc93 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71205) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 24 00:13:20 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-66214f98-5ae2-4a17-870b-ecf3c18cfc93 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpiiibiuxp/6234224455ab425db62aed8bf2fde566.delta /opt/stack/data/nova/instances/snapshots/tmpiiibiuxp/6234224455ab425db62aed8bf2fde566 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:21 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-66214f98-5ae2-4a17-870b-ecf3c18cfc93 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpiiibiuxp/6234224455ab425db62aed8bf2fde566.delta /opt/stack/data/nova/instances/snapshots/tmpiiibiuxp/6234224455ab425db62aed8bf2fde566" returned: 0 in 0.447s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:21 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-66214f98-5ae2-4a17-870b-ecf3c18cfc93 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Snapshot extracted, beginning image upload Apr 24 
00:13:22 user nova-compute[71205]: DEBUG nova.compute.manager [req-7e84607a-21f4-4c3a-b8f4-4ad1eb075174 req-041f8a99-9b17-4d1f-9671-ff374c3f3329 service nova] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Received event network-vif-plugged-86355003-a71c-4c4f-9536-beab8f09ded2 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:13:22 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-7e84607a-21f4-4c3a-b8f4-4ad1eb075174 req-041f8a99-9b17-4d1f-9671-ff374c3f3329 service nova] Acquiring lock "1821ecf2-8c71-48ad-96da-f63b83439c6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:22 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-7e84607a-21f4-4c3a-b8f4-4ad1eb075174 req-041f8a99-9b17-4d1f-9671-ff374c3f3329 service nova] Lock "1821ecf2-8c71-48ad-96da-f63b83439c6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:22 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-7e84607a-21f4-4c3a-b8f4-4ad1eb075174 req-041f8a99-9b17-4d1f-9671-ff374c3f3329 service nova] Lock "1821ecf2-8c71-48ad-96da-f63b83439c6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:22 user nova-compute[71205]: DEBUG nova.compute.manager [req-7e84607a-21f4-4c3a-b8f4-4ad1eb075174 req-041f8a99-9b17-4d1f-9671-ff374c3f3329 service nova] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] No waiting events found dispatching network-vif-plugged-86355003-a71c-4c4f-9536-beab8f09ded2 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:13:22 user nova-compute[71205]: WARNING nova.compute.manager [req-7e84607a-21f4-4c3a-b8f4-4ad1eb075174 req-041f8a99-9b17-4d1f-9671-ff374c3f3329 service nova] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Received unexpected event network-vif-plugged-86355003-a71c-4c4f-9536-beab8f09ded2 for instance with vm_state building and task_state spawning. 
Apr 24 00:13:22 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Resumed> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:13:22 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] VM Resumed (Lifecycle Event) Apr 24 00:13:22 user nova-compute[71205]: DEBUG nova.compute.manager [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Instance event wait completed in 0 seconds for {{(pid=71205) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 24 00:13:22 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Guest created on hypervisor {{(pid=71205) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 24 00:13:22 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Instance spawned successfully. Apr 24 00:13:22 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 24 00:13:22 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:13:22 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:13:22 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Found default for hw_cdrom_bus of ide {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:13:22 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Found default for hw_disk_bus of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:13:22 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc 
tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Found default for hw_input_bus of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:13:22 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Found default for hw_pointer_model of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:13:22 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Found default for hw_video_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:13:22 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Found default for hw_vif_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:13:22 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:13:22 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Started> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:13:22 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] VM Started (Lifecycle Event) Apr 24 00:13:22 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:13:22 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:13:22 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:13:22 user nova-compute[71205]: INFO nova.compute.manager [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Took 6.23 seconds to spawn the instance on the hypervisor. 
Apr 24 00:13:22 user nova-compute[71205]: DEBUG nova.compute.manager [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:13:22 user nova-compute[71205]: INFO nova.compute.manager [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Took 7.55 seconds to build instance. Apr 24 00:13:22 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-bac3d3f4-0452-4c2e-bbcc-20783774ebdc tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "1821ecf2-8c71-48ad-96da-f63b83439c6d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.671s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:23 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:23 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-66214f98-5ae2-4a17-870b-ecf3c18cfc93 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Snapshot image upload complete Apr 24 00:13:23 user nova-compute[71205]: INFO nova.compute.manager [None req-66214f98-5ae2-4a17-870b-ecf3c18cfc93 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Took 4.84 seconds to snapshot the instance on the hypervisor. 
Apr 24 00:13:23 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:27 user nova-compute[71205]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:13:27 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] VM Stopped (Lifecycle Event) Apr 24 00:13:27 user nova-compute[71205]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:13:27 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] VM Stopped (Lifecycle Event) Apr 24 00:13:27 user nova-compute[71205]: DEBUG nova.compute.manager [None req-960a9a19-6b4a-47a5-b4cd-d068db197213 None None] [instance: 212a2ad6-77ab-4615-b6ca-f426a3e76ab5] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:13:27 user nova-compute[71205]: DEBUG nova.compute.manager [None req-0ef6a7ca-e9ce-439c-8647-a30043495f4e None None] [instance: ffbf17ce-e3cb-4099-bea3-6887fef4476d] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:13:28 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:13:28 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:28 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:13:28 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:13:28 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:13:28 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:33 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:33 user nova-compute[71205]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:13:33 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] VM Stopped (Lifecycle Event) Apr 24 00:13:33 user nova-compute[71205]: DEBUG nova.compute.manager [None req-83561bd5-b37a-4f6c-88b5-724c808549d1 None None] [instance: e762c863-43e1-4f26-ab6b-c8ea40f08887] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:13:33 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:38 user nova-compute[71205]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:38 user nova-compute[71205]: DEBUG nova.compute.manager [req-369bb842-571b-4775-aecd-a782e2190f44 req-6a5be319-cfb3-42bd-928d-bf9cf8ce2f83 service nova] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Received event network-changed-d6f98d8d-f918-4aa5-abb0-a34e782f890a {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:13:38 user nova-compute[71205]: DEBUG nova.compute.manager [req-369bb842-571b-4775-aecd-a782e2190f44 req-6a5be319-cfb3-42bd-928d-bf9cf8ce2f83 service nova] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Refreshing instance network info cache due to event network-changed-d6f98d8d-f918-4aa5-abb0-a34e782f890a. {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:13:38 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-369bb842-571b-4775-aecd-a782e2190f44 req-6a5be319-cfb3-42bd-928d-bf9cf8ce2f83 service nova] Acquiring lock "refresh_cache-dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:13:38 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-369bb842-571b-4775-aecd-a782e2190f44 req-6a5be319-cfb3-42bd-928d-bf9cf8ce2f83 service nova] Acquired lock "refresh_cache-dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:13:38 user nova-compute[71205]: DEBUG nova.network.neutron [req-369bb842-571b-4775-aecd-a782e2190f44 req-6a5be319-cfb3-42bd-928d-bf9cf8ce2f83 service nova] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Refreshing network info cache for port d6f98d8d-f918-4aa5-abb0-a34e782f890a {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:13:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:38 user nova-compute[71205]: DEBUG nova.network.neutron [req-369bb842-571b-4775-aecd-a782e2190f44 req-6a5be319-cfb3-42bd-928d-bf9cf8ce2f83 service nova] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Updated VIF entry in instance network info cache for port d6f98d8d-f918-4aa5-abb0-a34e782f890a. 
{{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:13:38 user nova-compute[71205]: DEBUG nova.network.neutron [req-369bb842-571b-4775-aecd-a782e2190f44 req-6a5be319-cfb3-42bd-928d-bf9cf8ce2f83 service nova] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Updating instance_info_cache with network_info: [{"id": "d6f98d8d-f918-4aa5-abb0-a34e782f890a", "address": "fa:16:3e:b2:2c:52", "network": {"id": "fc4a840b-9c4f-4c52-8a4e-411d66ac14e1", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2047577326-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.15", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df0187dbb10d42da941645107df203f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6f98d8d-f9", "ovs_interfaceid": "d6f98d8d-f918-4aa5-abb0-a34e782f890a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:13:38 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-369bb842-571b-4775-aecd-a782e2190f44 req-6a5be319-cfb3-42bd-928d-bf9cf8ce2f83 service nova] Releasing lock "refresh_cache-dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:13:40 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Acquiring lock "f2a3766c-0a08-4eb5-a833-e39eb73d3426" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:40 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "f2a3766c-0a08-4eb5-a833-e39eb73d3426" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.004s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:40 user nova-compute[71205]: DEBUG nova.compute.manager [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Starting instance... 
{{(pid=71205) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 24 00:13:41 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:41 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:41 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71205) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 24 00:13:41 user nova-compute[71205]: INFO nova.compute.claims [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Claim successful on node user Apr 24 00:13:41 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:13:41 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:13:41 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.358s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:41 user nova-compute[71205]: DEBUG nova.compute.manager [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Start building networks asynchronously for instance. 
{{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 24 00:13:41 user nova-compute[71205]: DEBUG nova.compute.manager [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Allocating IP information in the background. {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 24 00:13:41 user nova-compute[71205]: DEBUG nova.network.neutron [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] allocate_for_instance() {{(pid=71205) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 24 00:13:41 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 24 00:13:41 user nova-compute[71205]: DEBUG nova.compute.manager [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Start building block device mappings for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 24 00:13:41 user nova-compute[71205]: DEBUG nova.policy [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '640ec20e46a2422a8aabcc152e522e02', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'df0187dbb10d42da941645107df203f6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71205) authorize /opt/stack/nova/nova/policy.py:203}} Apr 24 00:13:41 user nova-compute[71205]: DEBUG nova.compute.manager [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Start spawning the instance on the hypervisor. 
{{(pid=71205) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 24 00:13:41 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Creating instance directory {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 24 00:13:41 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Creating image(s) Apr 24 00:13:41 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Acquiring lock "/opt/stack/data/nova/instances/f2a3766c-0a08-4eb5-a833-e39eb73d3426/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:41 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "/opt/stack/data/nova/instances/f2a3766c-0a08-4eb5-a833-e39eb73d3426/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:41 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "/opt/stack/data/nova/instances/f2a3766c-0a08-4eb5-a833-e39eb73d3426/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:41 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:41 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.129s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:41 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None 
req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Acquiring lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:41 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:41 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:41 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.132s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:41 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/f2a3766c-0a08-4eb5-a833-e39eb73d3426/disk 1073741824 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:41 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/f2a3766c-0a08-4eb5-a833-e39eb73d3426/disk 1073741824" returned: 0 in 0.050s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:41 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.190s 
{{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:41 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:42 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.138s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:42 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Checking if we can resize image /opt/stack/data/nova/instances/f2a3766c-0a08-4eb5-a833-e39eb73d3426/disk. size=1073741824 {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 24 00:13:42 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f2a3766c-0a08-4eb5-a833-e39eb73d3426/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:13:42 user nova-compute[71205]: DEBUG nova.network.neutron [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Successfully created port: 5b300cf4-3c09-440b-8992-08ebf1a3d958 {{(pid=71205) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 24 00:13:42 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f2a3766c-0a08-4eb5-a833-e39eb73d3426/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:13:42 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Cannot resize image /opt/stack/data/nova/instances/f2a3766c-0a08-4eb5-a833-e39eb73d3426/disk to a smaller size. 
{{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 24 00:13:42 user nova-compute[71205]: DEBUG nova.objects.instance [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lazy-loading 'migration_context' on Instance uuid f2a3766c-0a08-4eb5-a833-e39eb73d3426 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:13:42 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Created local disks {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 24 00:13:42 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Ensure instance console log exists: /opt/stack/data/nova/instances/f2a3766c-0a08-4eb5-a833-e39eb73d3426/console.log {{(pid=71205) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 24 00:13:42 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:42 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:42 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:42 user nova-compute[71205]: DEBUG nova.network.neutron [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Successfully updated port: 5b300cf4-3c09-440b-8992-08ebf1a3d958 {{(pid=71205) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 24 00:13:42 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Acquiring lock "refresh_cache-f2a3766c-0a08-4eb5-a833-e39eb73d3426" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:13:42 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 
tempest-VolumesAdminNegativeTest-821594302-project-member] Acquired lock "refresh_cache-f2a3766c-0a08-4eb5-a833-e39eb73d3426" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:13:42 user nova-compute[71205]: DEBUG nova.network.neutron [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Building network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 24 00:13:42 user nova-compute[71205]: DEBUG nova.compute.manager [req-5f502d6c-b64c-47b2-9d6e-d0769e2ade80 req-de538814-8a89-4423-9c19-a4d8ba2bff1d service nova] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Received event network-changed-5b300cf4-3c09-440b-8992-08ebf1a3d958 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:13:42 user nova-compute[71205]: DEBUG nova.compute.manager [req-5f502d6c-b64c-47b2-9d6e-d0769e2ade80 req-de538814-8a89-4423-9c19-a4d8ba2bff1d service nova] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Refreshing instance network info cache due to event network-changed-5b300cf4-3c09-440b-8992-08ebf1a3d958. {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:13:42 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-5f502d6c-b64c-47b2-9d6e-d0769e2ade80 req-de538814-8a89-4423-9c19-a4d8ba2bff1d service nova] Acquiring lock "refresh_cache-f2a3766c-0a08-4eb5-a833-e39eb73d3426" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:13:42 user nova-compute[71205]: DEBUG nova.network.neutron [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Instance cache missing network info. 
{{(pid=71205) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.network.neutron [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Updating instance_info_cache with network_info: [{"id": "5b300cf4-3c09-440b-8992-08ebf1a3d958", "address": "fa:16:3e:e3:2d:d1", "network": {"id": "fc4a840b-9c4f-4c52-8a4e-411d66ac14e1", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2047577326-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df0187dbb10d42da941645107df203f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b300cf4-3c", "ovs_interfaceid": "5b300cf4-3c09-440b-8992-08ebf1a3d958", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Releasing lock "refresh_cache-f2a3766c-0a08-4eb5-a833-e39eb73d3426" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.compute.manager [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Instance network_info: |[{"id": "5b300cf4-3c09-440b-8992-08ebf1a3d958", "address": "fa:16:3e:e3:2d:d1", "network": {"id": "fc4a840b-9c4f-4c52-8a4e-411d66ac14e1", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2047577326-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df0187dbb10d42da941645107df203f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b300cf4-3c", "ovs_interfaceid": "5b300cf4-3c09-440b-8992-08ebf1a3d958", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-5f502d6c-b64c-47b2-9d6e-d0769e2ade80 req-de538814-8a89-4423-9c19-a4d8ba2bff1d service nova] Acquired lock "refresh_cache-f2a3766c-0a08-4eb5-a833-e39eb73d3426" {{(pid=71205) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.network.neutron [req-5f502d6c-b64c-47b2-9d6e-d0769e2ade80 req-de538814-8a89-4423-9c19-a4d8ba2bff1d service nova] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Refreshing network info cache for port 5b300cf4-3c09-440b-8992-08ebf1a3d958 {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Start _get_guest_xml network_info=[{"id": "5b300cf4-3c09-440b-8992-08ebf1a3d958", "address": "fa:16:3e:e3:2d:d1", "network": {"id": "fc4a840b-9c4f-4c52-8a4e-411d66ac14e1", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2047577326-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df0187dbb10d42da941645107df203f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b300cf4-3c", "ovs_interfaceid": "5b300cf4-3c09-440b-8992-08ebf1a3d958", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': 'fcf09ead-c5af-40cc-b5cf-92626e181ef9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 24 00:13:43 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:13:43 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71205) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-24T00:08:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=), allow threads: True {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Flavor limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Image limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Flavor pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Image pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) 
{{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Got 1 possible topologies {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:13:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-228310302',display_name='tempest-VolumesAdminNegativeTest-server-228310302',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-228310302',id=12,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='df0187dbb10d42da941645107df203f6',ramdisk_id='',reservation_id='r-ilhhrvbt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-821594302',owner_user_name='tempest-VolumesAdminNegati
veTest-821594302-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:13:41Z,user_data=None,user_id='640ec20e46a2422a8aabcc152e522e02',uuid=f2a3766c-0a08-4eb5-a833-e39eb73d3426,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5b300cf4-3c09-440b-8992-08ebf1a3d958", "address": "fa:16:3e:e3:2d:d1", "network": {"id": "fc4a840b-9c4f-4c52-8a4e-411d66ac14e1", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2047577326-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df0187dbb10d42da941645107df203f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b300cf4-3c", "ovs_interfaceid": "5b300cf4-3c09-440b-8992-08ebf1a3d958", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71205) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Converting VIF {"id": "5b300cf4-3c09-440b-8992-08ebf1a3d958", "address": "fa:16:3e:e3:2d:d1", "network": {"id": "fc4a840b-9c4f-4c52-8a4e-411d66ac14e1", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2047577326-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df0187dbb10d42da941645107df203f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b300cf4-3c", "ovs_interfaceid": "5b300cf4-3c09-440b-8992-08ebf1a3d958", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e3:2d:d1,bridge_name='br-int',has_traffic_filtering=True,id=5b300cf4-3c09-440b-8992-08ebf1a3d958,network=Network(fc4a840b-9c4f-4c52-8a4e-411d66ac14e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b300cf4-3c') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.objects.instance [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] 
Lazy-loading 'pci_devices' on Instance uuid f2a3766c-0a08-4eb5-a833-e39eb73d3426 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] End _get_guest_xml xml=
[guest domain XML omitted: the XML markup was stripped during log capture; the surviving text values include uuid f2a3766c-0a08-4eb5-a833-e39eb73d3426, name instance-0000000c, memory 131072, 1 vCPU, server name tempest-VolumesAdminNegativeTest-server-228310302, creation time 2023-04-24 00:13:43, flavor values 128 / 1 / 0 / 0 / 1, owner tempest-VolumesAdminNegativeTest-821594302-project-member / tempest-VolumesAdminNegativeTest-821594302, sysinfo strings OpenStack Foundation / OpenStack Nova / 0.0.0 / Virtual Machine, OS type hvm, CPU model Nehalem, and RNG backend /dev/urandom] {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}}
Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:13:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-228310302',display_name='tempest-VolumesAdminNegativeTest-server-228310302',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-228310302',id=12,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='df0187dbb10d42da941645107df203f6',ramdisk_id='',reservation_id='r-ilhhrvbt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-821594302',owner_user_name='tempest-VolumesAdminNegativeTest-821594302-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:13:41Z,user_data=None,user_id='640ec20e46a2422a8aabcc152e522e02',uuid=f2a3766c-0a08-4eb5-a833-e39eb73d3426,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5b300cf4-3c09-440b-8992-08ebf1a3d958", "address": "fa:16:3e:e3:2d:d1",
"network": {"id": "fc4a840b-9c4f-4c52-8a4e-411d66ac14e1", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2047577326-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df0187dbb10d42da941645107df203f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b300cf4-3c", "ovs_interfaceid": "5b300cf4-3c09-440b-8992-08ebf1a3d958", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Converting VIF {"id": "5b300cf4-3c09-440b-8992-08ebf1a3d958", "address": "fa:16:3e:e3:2d:d1", "network": {"id": "fc4a840b-9c4f-4c52-8a4e-411d66ac14e1", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2047577326-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df0187dbb10d42da941645107df203f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b300cf4-3c", "ovs_interfaceid": "5b300cf4-3c09-440b-8992-08ebf1a3d958", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e3:2d:d1,bridge_name='br-int',has_traffic_filtering=True,id=5b300cf4-3c09-440b-8992-08ebf1a3d958,network=Network(fc4a840b-9c4f-4c52-8a4e-411d66ac14e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b300cf4-3c') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG os_vif [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e3:2d:d1,bridge_name='br-int',has_traffic_filtering=True,id=5b300cf4-3c09-440b-8992-08ebf1a3d958,network=Network(fc4a840b-9c4f-4c52-8a4e-411d66ac14e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b300cf4-3c') {{(pid=71205) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 24 00:13:43 user 
nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5b300cf4-3c, may_exist=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5b300cf4-3c, col_values=(('external_ids', {'iface-id': '5b300cf4-3c09-440b-8992-08ebf1a3d958', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e3:2d:d1', 'vm-uuid': 'f2a3766c-0a08-4eb5-a833-e39eb73d3426'}),)) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:43 user nova-compute[71205]: INFO os_vif [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e3:2d:d1,bridge_name='br-int',has_traffic_filtering=True,id=5b300cf4-3c09-440b-8992-08ebf1a3d958,network=Network(fc4a840b-9c4f-4c52-8a4e-411d66ac14e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b300cf4-3c') Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71205) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] No VIF found with MAC fa:16:3e:e3:2d:d1, not building metadata {{(pid=71205) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.compute.manager [req-81be1604-36c1-4ae8-8772-d72c8888cf25 req-c96c8c88-a14b-4950-96c7-64a5cca4d3fe service nova] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Received event network-changed-5a56c96a-0083-47e5-819c-1d802bbcd6ea {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.compute.manager [req-81be1604-36c1-4ae8-8772-d72c8888cf25 req-c96c8c88-a14b-4950-96c7-64a5cca4d3fe service nova] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Refreshing instance network info cache due to event network-changed-5a56c96a-0083-47e5-819c-1d802bbcd6ea. {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-81be1604-36c1-4ae8-8772-d72c8888cf25 req-c96c8c88-a14b-4950-96c7-64a5cca4d3fe service nova] Acquiring lock "refresh_cache-5e7bfc8d-7d4a-42f7-9657-cc65e1364b87" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-81be1604-36c1-4ae8-8772-d72c8888cf25 req-c96c8c88-a14b-4950-96c7-64a5cca4d3fe service nova] Acquired lock "refresh_cache-5e7bfc8d-7d4a-42f7-9657-cc65e1364b87" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.network.neutron [req-81be1604-36c1-4ae8-8772-d72c8888cf25 req-c96c8c88-a14b-4950-96c7-64a5cca4d3fe service nova] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Refreshing network info cache for port 5a56c96a-0083-47e5-819c-1d802bbcd6ea {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.network.neutron [req-5f502d6c-b64c-47b2-9d6e-d0769e2ade80 req-de538814-8a89-4423-9c19-a4d8ba2bff1d service nova] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Updated VIF entry in instance network info cache for port 5b300cf4-3c09-440b-8992-08ebf1a3d958. 
{{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG nova.network.neutron [req-5f502d6c-b64c-47b2-9d6e-d0769e2ade80 req-de538814-8a89-4423-9c19-a4d8ba2bff1d service nova] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Updating instance_info_cache with network_info: [{"id": "5b300cf4-3c09-440b-8992-08ebf1a3d958", "address": "fa:16:3e:e3:2d:d1", "network": {"id": "fc4a840b-9c4f-4c52-8a4e-411d66ac14e1", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2047577326-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df0187dbb10d42da941645107df203f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b300cf4-3c", "ovs_interfaceid": "5b300cf4-3c09-440b-8992-08ebf1a3d958", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:13:43 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-5f502d6c-b64c-47b2-9d6e-d0769e2ade80 req-de538814-8a89-4423-9c19-a4d8ba2bff1d service nova] Releasing lock "refresh_cache-f2a3766c-0a08-4eb5-a833-e39eb73d3426" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:13:44 user nova-compute[71205]: DEBUG nova.network.neutron [req-81be1604-36c1-4ae8-8772-d72c8888cf25 req-c96c8c88-a14b-4950-96c7-64a5cca4d3fe service nova] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Updated VIF entry in instance network info cache for port 5a56c96a-0083-47e5-819c-1d802bbcd6ea. 
{{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:13:44 user nova-compute[71205]: DEBUG nova.network.neutron [req-81be1604-36c1-4ae8-8772-d72c8888cf25 req-c96c8c88-a14b-4950-96c7-64a5cca4d3fe service nova] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Updating instance_info_cache with network_info: [{"id": "5a56c96a-0083-47e5-819c-1d802bbcd6ea", "address": "fa:16:3e:04:8c:34", "network": {"id": "60f29626-4198-42e4-835f-2d0d9cfabf8d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-973896992-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6e163039cafd4b2880a41ded2e2f7d00", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a56c96a-00", "ovs_interfaceid": "5a56c96a-0083-47e5-819c-1d802bbcd6ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:13:44 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-81be1604-36c1-4ae8-8772-d72c8888cf25 req-c96c8c88-a14b-4950-96c7-64a5cca4d3fe service nova] Releasing lock "refresh_cache-5e7bfc8d-7d4a-42f7-9657-cc65e1364b87" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:13:44 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:44 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:44 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:44 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-368298a1-d90b-4019-8e42-332edd486850 tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Acquiring lock "5e7bfc8d-7d4a-42f7-9657-cc65e1364b87" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None 
req-368298a1-d90b-4019-8e42-332edd486850 tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Lock "5e7bfc8d-7d4a-42f7-9657-cc65e1364b87" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-368298a1-d90b-4019-8e42-332edd486850 tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Acquiring lock "5e7bfc8d-7d4a-42f7-9657-cc65e1364b87-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-368298a1-d90b-4019-8e42-332edd486850 tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Lock "5e7bfc8d-7d4a-42f7-9657-cc65e1364b87-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-368298a1-d90b-4019-8e42-332edd486850 tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Lock "5e7bfc8d-7d4a-42f7-9657-cc65e1364b87-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:45 user nova-compute[71205]: INFO nova.compute.manager [None req-368298a1-d90b-4019-8e42-332edd486850 tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Terminating instance Apr 24 00:13:45 user nova-compute[71205]: DEBUG nova.compute.manager [None req-368298a1-d90b-4019-8e42-332edd486850 tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Start destroying the instance on the hypervisor. 
{{(pid=71205) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG nova.compute.manager [req-77d57db2-0f34-46ec-ba8c-bdfc99babb68 req-cea6014f-8ded-4027-a8c8-b7d148dd19f3 service nova] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Received event network-vif-unplugged-5a56c96a-0083-47e5-819c-1d802bbcd6ea {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-77d57db2-0f34-46ec-ba8c-bdfc99babb68 req-cea6014f-8ded-4027-a8c8-b7d148dd19f3 service nova] Acquiring lock "5e7bfc8d-7d4a-42f7-9657-cc65e1364b87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-77d57db2-0f34-46ec-ba8c-bdfc99babb68 req-cea6014f-8ded-4027-a8c8-b7d148dd19f3 service nova] Lock "5e7bfc8d-7d4a-42f7-9657-cc65e1364b87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-77d57db2-0f34-46ec-ba8c-bdfc99babb68 req-cea6014f-8ded-4027-a8c8-b7d148dd19f3 service nova] Lock "5e7bfc8d-7d4a-42f7-9657-cc65e1364b87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG nova.compute.manager [req-77d57db2-0f34-46ec-ba8c-bdfc99babb68 req-cea6014f-8ded-4027-a8c8-b7d148dd19f3 service nova] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] No waiting events found dispatching network-vif-unplugged-5a56c96a-0083-47e5-819c-1d802bbcd6ea {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG nova.compute.manager [req-77d57db2-0f34-46ec-ba8c-bdfc99babb68 req-cea6014f-8ded-4027-a8c8-b7d148dd19f3 service nova] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Received event network-vif-unplugged-5a56c96a-0083-47e5-819c-1d802bbcd6ea for instance with task_state deleting. 
{{(pid=71205) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG nova.compute.manager [req-328d8f86-b2c5-40e1-ba14-9f3fdf88a4e9 req-561e05cd-5c42-4cc4-a5b5-4ad699e348e3 service nova] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Received event network-vif-plugged-5b300cf4-3c09-440b-8992-08ebf1a3d958 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-328d8f86-b2c5-40e1-ba14-9f3fdf88a4e9 req-561e05cd-5c42-4cc4-a5b5-4ad699e348e3 service nova] Acquiring lock "f2a3766c-0a08-4eb5-a833-e39eb73d3426-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-328d8f86-b2c5-40e1-ba14-9f3fdf88a4e9 req-561e05cd-5c42-4cc4-a5b5-4ad699e348e3 service nova] Lock "f2a3766c-0a08-4eb5-a833-e39eb73d3426-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-328d8f86-b2c5-40e1-ba14-9f3fdf88a4e9 req-561e05cd-5c42-4cc4-a5b5-4ad699e348e3 service nova] Lock "f2a3766c-0a08-4eb5-a833-e39eb73d3426-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG nova.compute.manager [req-328d8f86-b2c5-40e1-ba14-9f3fdf88a4e9 req-561e05cd-5c42-4cc4-a5b5-4ad699e348e3 service nova] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] No waiting events found dispatching network-vif-plugged-5b300cf4-3c09-440b-8992-08ebf1a3d958 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:13:45 user nova-compute[71205]: WARNING nova.compute.manager [req-328d8f86-b2c5-40e1-ba14-9f3fdf88a4e9 req-561e05cd-5c42-4cc4-a5b5-4ad699e348e3 service nova] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Received unexpected event network-vif-plugged-5b300cf4-3c09-440b-8992-08ebf1a3d958 for instance with vm_state building and task_state spawning. 
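The entries above trace how an external network-vif-plugged event from Neutron is dispatched: the compute manager takes the per-instance "-events" lock, tries to pop a registered waiter for that event, and logs a warning when nothing is waiting (here the instance is still spawning, so no waiter had been registered yet). A minimal sketch of that dispatch pattern, with hypothetical names and a plain threading lock standing in for oslo.concurrency, might look like:

```python
# Illustrative sketch only -- hypothetical names, not Nova's implementation.
# Mirrors the logged sequence: lock "<uuid>-events", pop a waiter, warn if none.
import threading
from collections import defaultdict


class InstanceEventDispatcher:
    def __init__(self):
        self._locks = defaultdict(threading.Lock)   # one lock per instance UUID
        self._waiters = defaultdict(dict)           # uuid -> {event_name: Event}

    def expect(self, uuid, event_name):
        """Register interest in an event before triggering the external action."""
        waiter = threading.Event()
        with self._locks[uuid]:
            self._waiters[uuid][event_name] = waiter
        return waiter

    def dispatch(self, uuid, event_name):
        """Called when e.g. network-vif-plugged-<port> arrives from the network service."""
        with self._locks[uuid]:                     # "Acquiring lock '<uuid>-events'"
            waiter = self._waiters[uuid].pop(event_name, None)
        if waiter is None:
            # corresponds to "No waiting events found dispatching ..." + WARNING
            print(f"WARNING: unexpected event {event_name} for {uuid}")
        else:
            waiter.set()                            # wake the thread blocked on it
```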
Apr 24 00:13:45 user nova-compute[71205]: DEBUG nova.compute.manager [req-328d8f86-b2c5-40e1-ba14-9f3fdf88a4e9 req-561e05cd-5c42-4cc4-a5b5-4ad699e348e3 service nova] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Received event network-vif-plugged-5b300cf4-3c09-440b-8992-08ebf1a3d958 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-328d8f86-b2c5-40e1-ba14-9f3fdf88a4e9 req-561e05cd-5c42-4cc4-a5b5-4ad699e348e3 service nova] Acquiring lock "f2a3766c-0a08-4eb5-a833-e39eb73d3426-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-328d8f86-b2c5-40e1-ba14-9f3fdf88a4e9 req-561e05cd-5c42-4cc4-a5b5-4ad699e348e3 service nova] Lock "f2a3766c-0a08-4eb5-a833-e39eb73d3426-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-328d8f86-b2c5-40e1-ba14-9f3fdf88a4e9 req-561e05cd-5c42-4cc4-a5b5-4ad699e348e3 service nova] Lock "f2a3766c-0a08-4eb5-a833-e39eb73d3426-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG nova.compute.manager [req-328d8f86-b2c5-40e1-ba14-9f3fdf88a4e9 req-561e05cd-5c42-4cc4-a5b5-4ad699e348e3 service nova] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] No waiting events found dispatching network-vif-plugged-5b300cf4-3c09-440b-8992-08ebf1a3d958 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:13:45 user nova-compute[71205]: WARNING nova.compute.manager [req-328d8f86-b2c5-40e1-ba14-9f3fdf88a4e9 req-561e05cd-5c42-4cc4-a5b5-4ad699e348e3 service nova] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Received unexpected event network-vif-plugged-5b300cf4-3c09-440b-8992-08ebf1a3d958 for instance with vm_state building and task_state spawning. Apr 24 00:13:45 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Instance destroyed successfully. 
Apr 24 00:13:45 user nova-compute[71205]: DEBUG nova.objects.instance [None req-368298a1-d90b-4019-8e42-332edd486850 tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Lazy-loading 'resources' on Instance uuid 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-368298a1-d90b-4019-8e42-332edd486850 tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2023-04-24T00:11:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-1292260913',display_name='tempest-AttachSCSIVolumeTestJSON-server-1292260913',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-1292260913',id=8,image_ref='6d0fc2e0-41f4-457d-aa83-7dd6fd114687',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP/vhtnDrSUbhDcnzbj2hTMPnSPvEi9MH767Q1zBuBu7Q388dTCb4/K/XteUMBOM3VV8UZ23HJ9k+WdLhWp/wu0OUK0FLVdByp/HDgDXNkveo1WfqdtyNUzgJX99Y7XXPw==',key_name='tempest-keypair-1524659291',keypairs=,launch_index=0,launched_at=2023-04-24T00:12:00Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='6e163039cafd4b2880a41ded2e2f7d00',ramdisk_id='',reservation_id='r-trgx5ogp',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6d0fc2e0-41f4-457d-aa83-7dd6fd114687',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_scsi_model='virtio-scsi',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachSCSIVolumeTestJSON-1166211293',owner_user_name='tempest-AttachSCSIVolumeTestJSON-1166211293-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-24T00:12:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='04625ea29ba641fc8342441f61274d4f',uuid=5e7bfc8d-7d4a-42f7-9657-cc65e1364b87,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5a56c96a-0083-47e5-819c-1d802bbcd6ea", "address": "fa:16:3e:04:8c:34", "network": {"id": "60f29626-4198-42e4-835f-2d0d9cfabf8d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-973896992-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.190", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6e163039cafd4b2880a41ded2e2f7d00", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a56c96a-00", "ovs_interfaceid": "5a56c96a-0083-47e5-819c-1d802bbcd6ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-368298a1-d90b-4019-8e42-332edd486850 tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Converting VIF {"id": "5a56c96a-0083-47e5-819c-1d802bbcd6ea", "address": "fa:16:3e:04:8c:34", "network": {"id": "60f29626-4198-42e4-835f-2d0d9cfabf8d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-973896992-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6e163039cafd4b2880a41ded2e2f7d00", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a56c96a-00", "ovs_interfaceid": "5a56c96a-0083-47e5-819c-1d802bbcd6ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-368298a1-d90b-4019-8e42-332edd486850 tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:04:8c:34,bridge_name='br-int',has_traffic_filtering=True,id=5a56c96a-0083-47e5-819c-1d802bbcd6ea,network=Network(60f29626-4198-42e4-835f-2d0d9cfabf8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5a56c96a-00') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG os_vif [None req-368298a1-d90b-4019-8e42-332edd486850 tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:04:8c:34,bridge_name='br-int',has_traffic_filtering=True,id=5a56c96a-0083-47e5-819c-1d802bbcd6ea,network=Network(60f29626-4198-42e4-835f-2d0d9cfabf8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5a56c96a-00') {{(pid=71205) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): 
DelPortCommand(_result=None, port=tap5a56c96a-00, bridge=br-int, if_exists=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:45 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:13:45 user nova-compute[71205]: INFO os_vif [None req-368298a1-d90b-4019-8e42-332edd486850 tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:04:8c:34,bridge_name='br-int',has_traffic_filtering=True,id=5a56c96a-0083-47e5-819c-1d802bbcd6ea,network=Network(60f29626-4198-42e4-835f-2d0d9cfabf8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5a56c96a-00') Apr 24 00:13:45 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-368298a1-d90b-4019-8e42-332edd486850 tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Deleting instance files /opt/stack/data/nova/instances/5e7bfc8d-7d4a-42f7-9657-cc65e1364b87_del Apr 24 00:13:45 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-368298a1-d90b-4019-8e42-332edd486850 tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Deletion of /opt/stack/data/nova/instances/5e7bfc8d-7d4a-42f7-9657-cc65e1364b87_del complete Apr 24 00:13:46 user nova-compute[71205]: INFO nova.compute.manager [None req-368298a1-d90b-4019-8e42-332edd486850 tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Took 0.71 seconds to destroy the instance on the hypervisor. Apr 24 00:13:46 user nova-compute[71205]: DEBUG oslo.service.loopingcall [None req-368298a1-d90b-4019-8e42-332edd486850 tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71205) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 24 00:13:46 user nova-compute[71205]: DEBUG nova.compute.manager [-] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Deallocating network for instance {{(pid=71205) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 24 00:13:46 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] deallocate_for_instance() {{(pid=71205) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 24 00:13:46 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Resumed> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:13:46 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] VM Resumed (Lifecycle Event) Apr 24 00:13:46 user nova-compute[71205]: DEBUG nova.compute.manager [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Instance event wait completed in 0 seconds for {{(pid=71205) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 24 00:13:46 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Guest created on hypervisor {{(pid=71205) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 24 00:13:46 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Instance spawned successfully. Apr 24 00:13:46 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 24 00:13:46 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:13:46 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:13:47 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Took 1.18 seconds to deallocate network for instance. 
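The teardown logged above also shows the VIF being detached through os-vif: the network-info dict is converted to a VIFOpenVSwitch object, handed to os_vif.unplug(), and the ovs plugin turns that into the OVSDB DelPortCommand that removes tap5a56c96a-00 from br-int. A rough sketch of driving that library call directly, reusing the values from the log repr above (the exact object fields and constructor usage are assumptions based on that repr, not Nova's code path), could look like:

```python
# Rough sketch, assuming the public os-vif API (initialize/unplug) and the
# object fields shown in the VIFOpenVSwitch repr logged above.
import os_vif
from os_vif.objects import instance_info, network, vif

os_vif.initialize()  # loads the linux_bridge/noop/ovs plugins

port_id = "5a56c96a-0083-47e5-819c-1d802bbcd6ea"            # values taken from the log
vif_obj = vif.VIFOpenVSwitch(
    id=port_id,
    address="fa:16:3e:04:8c:34",
    bridge_name="br-int",
    vif_name="tap5a56c96a-00",
    port_profile=vif.VIFPortProfileOpenVSwitch(interface_id=port_id),
    network=network.Network(id="60f29626-4198-42e4-835f-2d0d9cfabf8d"),
)
inst = instance_info.InstanceInfo(
    uuid="5e7bfc8d-7d4a-42f7-9657-cc65e1364b87",
    name="tempest-AttachSCSIVolumeTestJSON-server-1292260913",
)

# Roughly equivalent to the logged transaction
# DelPortCommand(port=tap5a56c96a-00, bridge=br-int, if_exists=True):
# the ovs plugin removes the tap port from the integration bridge.
os_vif.unplug(vif_obj, inst)
```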
Apr 24 00:13:47 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:13:47 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Found default for hw_cdrom_bus of ide {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:13:47 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Found default for hw_disk_bus of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:13:47 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Found default for hw_input_bus of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:13:47 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Found default for hw_pointer_model of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:13:47 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Found default for hw_video_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:13:47 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Found default for hw_vif_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:13:47 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] During sync_power_state the instance has a pending task (spawning). Skip. 
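For the instance that just spawned, the driver records the bus and device-model defaults it actually used so that later operations keep the same virtual hardware layout; the earlier Instance dump in this log shows such values persisted as image_hw_* keys in system_metadata. Collected from the "Found default for ..." entries above, the registered defaults amount to something like the following mapping (an illustrative reading of the log, not an authoritative schema):

```python
# Defaults recorded above for instance f2a3766c-0a08-4eb5-a833-e39eb73d3426.
registered_defaults = {
    "hw_cdrom_bus": "ide",
    "hw_disk_bus": "virtio",
    "hw_input_bus": None,
    "hw_pointer_model": None,
    "hw_video_model": "virtio",
    "hw_vif_model": "virtio",
}

# Persisted with an image_ prefix, matching the image_hw_* keys seen in the
# system_metadata dump earlier in this log (hedged assumption about the exact key form).
system_metadata_updates = {
    f"image_{prop}": value for prop, value in registered_defaults.items()
}
```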
Apr 24 00:13:47 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Started> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:13:47 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] VM Started (Lifecycle Event) Apr 24 00:13:47 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:13:47 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:13:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-368298a1-d90b-4019-8e42-332edd486850 tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-368298a1-d90b-4019-8e42-332edd486850 tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:47 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:13:47 user nova-compute[71205]: INFO nova.compute.manager [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Took 5.73 seconds to spawn the instance on the hypervisor. Apr 24 00:13:47 user nova-compute[71205]: DEBUG nova.compute.manager [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:13:47 user nova-compute[71205]: INFO nova.compute.manager [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Took 6.38 seconds to build instance. 
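The "Resumed" and "Started" lifecycle events above also trigger a power-state sync: the hypervisor reports the guest as running (VM power_state 1) while the database still holds 0, but because the instance has a pending task (spawning) the sync is skipped rather than interfering with the in-flight build. A simplified sketch of that guard, using hypothetical names that only mirror the logged decision:

```python
# Simplified guard, hypothetical names -- reflects the logged behaviour only.
NOSTATE, RUNNING = 0, 1


def sync_power_state(db_instance, vm_power_state):
    if db_instance["task_state"] is not None:
        # e.g. task_state == "spawning": another operation owns the instance,
        # so do not rewrite power_state mid-flight
        # ("During sync_power_state the instance has a pending task ... Skip.")
        return
    if db_instance["power_state"] != vm_power_state:
        db_instance["power_state"] = vm_power_state  # would be an instance save


inst = {"task_state": "spawning", "power_state": NOSTATE}
sync_power_state(inst, RUNNING)
assert inst["power_state"] == NOSTATE  # skipped while spawning
```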
Apr 24 00:13:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-c2424c7f-7f0e-4e99-a47b-c0a178ef366f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "f2a3766c-0a08-4eb5-a833-e39eb73d3426" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.493s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:47 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-368298a1-d90b-4019-8e42-332edd486850 tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:13:47 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-368298a1-d90b-4019-8e42-332edd486850 tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:13:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-368298a1-d90b-4019-8e42-332edd486850 tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.288s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:47 user nova-compute[71205]: INFO nova.scheduler.client.report [None req-368298a1-d90b-4019-8e42-332edd486850 tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Deleted allocations for instance 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87 Apr 24 00:13:47 user nova-compute[71205]: DEBUG nova.compute.manager [req-93f4177c-8f69-47e2-b6a4-8446eb1c5b55 req-2977d797-00ca-4e7b-a87f-feee50d2685b service nova] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Received event network-vif-plugged-5a56c96a-0083-47e5-819c-1d802bbcd6ea {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:13:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-93f4177c-8f69-47e2-b6a4-8446eb1c5b55 req-2977d797-00ca-4e7b-a87f-feee50d2685b service nova] Acquiring lock "5e7bfc8d-7d4a-42f7-9657-cc65e1364b87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:13:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-93f4177c-8f69-47e2-b6a4-8446eb1c5b55 req-2977d797-00ca-4e7b-a87f-feee50d2685b service nova] Lock "5e7bfc8d-7d4a-42f7-9657-cc65e1364b87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:13:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-93f4177c-8f69-47e2-b6a4-8446eb1c5b55 req-2977d797-00ca-4e7b-a87f-feee50d2685b service nova] Lock "5e7bfc8d-7d4a-42f7-9657-cc65e1364b87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:47 user nova-compute[71205]: DEBUG nova.compute.manager [req-93f4177c-8f69-47e2-b6a4-8446eb1c5b55 req-2977d797-00ca-4e7b-a87f-feee50d2685b service nova] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] No waiting events found dispatching network-vif-plugged-5a56c96a-0083-47e5-819c-1d802bbcd6ea {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:13:47 user nova-compute[71205]: WARNING nova.compute.manager [req-93f4177c-8f69-47e2-b6a4-8446eb1c5b55 req-2977d797-00ca-4e7b-a87f-feee50d2685b service nova] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Received unexpected event network-vif-plugged-5a56c96a-0083-47e5-819c-1d802bbcd6ea for instance with vm_state deleted and task_state None. Apr 24 00:13:47 user nova-compute[71205]: DEBUG nova.compute.manager [req-93f4177c-8f69-47e2-b6a4-8446eb1c5b55 req-2977d797-00ca-4e7b-a87f-feee50d2685b service nova] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Received event network-vif-deleted-5a56c96a-0083-47e5-819c-1d802bbcd6ea {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:13:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-368298a1-d90b-4019-8e42-332edd486850 tempest-AttachSCSIVolumeTestJSON-1166211293 tempest-AttachSCSIVolumeTestJSON-1166211293-project-member] Lock "5e7bfc8d-7d4a-42f7-9657-cc65e1364b87" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.393s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:13:48 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:50 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:55 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:13:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:00 user nova-compute[71205]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:14:00 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] VM Stopped (Lifecycle Event) Apr 24 00:14:00 user nova-compute[71205]: DEBUG nova.compute.manager [None req-b3bdd6b4-4510-4ba9-9519-45c1b0ea1edd None None] [instance: 5e7bfc8d-7d4a-42f7-9657-cc65e1364b87] Checking state {{(pid=71205) _get_power_state 
/opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:14:00 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:01 user nova-compute[71205]: INFO nova.compute.manager [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Rescuing Apr 24 00:14:01 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Acquiring lock "refresh_cache-f1a14b79-7792-4962-bbe1-ec11e10e6948" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:14:01 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Acquired lock "refresh_cache-f1a14b79-7792-4962-bbe1-ec11e10e6948" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:14:01 user nova-compute[71205]: DEBUG nova.network.neutron [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Building network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 24 00:14:02 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:02 user nova-compute[71205]: DEBUG nova.network.neutron [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Updating instance_info_cache with network_info: [{"id": "fbd4aa76-4861-41fe-a0dc-5dee747b2517", "address": "fa:16:3e:33:34:0d", "network": {"id": "ba4ccc60-ffcd-4638-9c93-6d4d59641961", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1591321320-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5cff0cbf3a5c4a4aadb3399a31adff0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbd4aa76-48", "ovs_interfaceid": "fbd4aa76-4861-41fe-a0dc-5dee747b2517", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:14:02 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] 
Releasing lock "refresh_cache-f1a14b79-7792-4962-bbe1-ec11e10e6948" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:14:02 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:02 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:02 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:02 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:02 user nova-compute[71205]: DEBUG nova.compute.manager [req-5bea7efb-096a-4e89-84eb-47007586674f req-78742c40-0390-41e9-aeea-fdee6049b7e7 service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Received event network-vif-unplugged-fbd4aa76-4861-41fe-a0dc-5dee747b2517 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:14:02 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-5bea7efb-096a-4e89-84eb-47007586674f req-78742c40-0390-41e9-aeea-fdee6049b7e7 service nova] Acquiring lock "f1a14b79-7792-4962-bbe1-ec11e10e6948-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:02 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-5bea7efb-096a-4e89-84eb-47007586674f req-78742c40-0390-41e9-aeea-fdee6049b7e7 service nova] Lock "f1a14b79-7792-4962-bbe1-ec11e10e6948-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:02 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-5bea7efb-096a-4e89-84eb-47007586674f req-78742c40-0390-41e9-aeea-fdee6049b7e7 service nova] Lock "f1a14b79-7792-4962-bbe1-ec11e10e6948-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:02 user nova-compute[71205]: DEBUG nova.compute.manager [req-5bea7efb-096a-4e89-84eb-47007586674f req-78742c40-0390-41e9-aeea-fdee6049b7e7 service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] No waiting events found dispatching network-vif-unplugged-fbd4aa76-4861-41fe-a0dc-5dee747b2517 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:14:02 user nova-compute[71205]: WARNING nova.compute.manager [req-5bea7efb-096a-4e89-84eb-47007586674f req-78742c40-0390-41e9-aeea-fdee6049b7e7 service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Received unexpected event network-vif-unplugged-fbd4aa76-4861-41fe-a0dc-5dee747b2517 for instance with vm_state active and task_state rescuing. Apr 24 00:14:02 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Instance destroyed successfully. 
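At this point the rescue flow has torn down the running domain; the entries that follow rebuild it around a rescue disk created as a qcow2 overlay on top of the cached base image (the exact qemu-img invocations are logged a few entries below). A sketch reproducing that overlay step outside Nova, using the paths from those entries (the helper itself is hypothetical, not Nova's imagebackend code), might be:

```python
# Sketch of the overlay creation shown below; paths taken from the log.
import json
import subprocess

base = "/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8"
overlay = "/opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk.rescue"

# Probe the base image (Nova wraps this call in prlimit; plain invocation shown here).
info = json.loads(subprocess.check_output(
    ["qemu-img", "info", "--force-share", "--output=json", base]))

# Thin overlay: writes land in disk.rescue, reads fall through to the raw base image.
subprocess.check_call([
    "qemu-img", "create", "-f", "qcow2",
    "-o", f"backing_file={base},backing_fmt={info['format']}",
    overlay,
])
```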
Apr 24 00:14:03 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Attempting rescue Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} {{(pid=71205) rescue /opt/stack/nova/nova/virt/libvirt/driver.py:4289}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Instance directory exists: not creating {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4694}} Apr 24 00:14:03 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Creating image(s) Apr 24 00:14:03 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Acquiring lock "/opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "/opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "/opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.objects.instance [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lazy-loading 'trusted_certs' on Instance uuid f1a14b79-7792-4962-bbe1-ec11e10e6948 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 
00:14:03 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Acquiring lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.150s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk.rescue {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk.rescue" returned: 0 in 0.054s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock 
"26d4c718c7a2a978d2022c858a570bbc0ccab5d8" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.210s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.objects.instance [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lazy-loading 'migration_context' on Instance uuid f1a14b79-7792-4962-bbe1-ec11e10e6948 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Created local disks {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Start _get_guest_xml network_info=[{"id": "fbd4aa76-4861-41fe-a0dc-5dee747b2517", "address": "fa:16:3e:33:34:0d", "network": {"id": "ba4ccc60-ffcd-4638-9c93-6d4d59641961", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1591321320-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1591321320-network", "vif_mac": "fa:16:3e:33:34:0d"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5cff0cbf3a5c4a4aadb3399a31adff0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbd4aa76-48", "ovs_interfaceid": "fbd4aa76-4861-41fe-a0dc-5dee747b2517", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=) rescue={'image_id': 'fcf09ead-c5af-40cc-b5cf-92626e181ef9', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.objects.instance [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] 
Lazy-loading 'resources' on Instance uuid f1a14b79-7792-4962-bbe1-ec11e10e6948 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.objects.instance [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lazy-loading 'numa_topology' on Instance uuid f1a14b79-7792-4962-bbe1-ec11e10e6948 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:14:03 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:14:03 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71205) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-24T00:08:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=), allow threads: True {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Flavor limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Image limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 
tempest-ServerRescueNegativeTestJSON-487575741-project-member] Flavor pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Image pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Got 1 possible topologies {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.objects.instance [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lazy-loading 'vcpu_model' on Instance uuid f1a14b79-7792-4962-bbe1-ec11e10e6948 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:12:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-265518045',display_name='tempest-ServerRescueNegativeTestJSON-server-265518045',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-265518045',id=10,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-24T00:12:19Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='5cff0cbf3a5c4a4aadb3399a31adff0d',ramdisk_id='',reservation_id='r-xi2sy34x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerRescueNegativeTestJSON-487575741',owner_user_name='tempest-ServerRescueNegativeTestJSON-487575741-project-member'},tags=,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:12:20Z,user_data=None,user_id='514ecffec8034d60ae3c00ecd1ef5c8b',uuid=f1a14b79-7792-4962-bbe1-ec11e10e6948,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fbd4aa76-4861-41fe-a0dc-5dee747b2517", "address": "fa:16:3e:33:34:0d", "network": {"id": "ba4ccc60-ffcd-4638-9c93-6d4d59641961", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1591321320-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1591321320-network", "vif_mac": "fa:16:3e:33:34:0d"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5cff0cbf3a5c4a4aadb3399a31adff0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbd4aa76-48", "ovs_interfaceid": "fbd4aa76-4861-41fe-a0dc-5dee747b2517", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71205) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None 
req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Converting VIF {"id": "fbd4aa76-4861-41fe-a0dc-5dee747b2517", "address": "fa:16:3e:33:34:0d", "network": {"id": "ba4ccc60-ffcd-4638-9c93-6d4d59641961", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1591321320-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1591321320-network", "vif_mac": "fa:16:3e:33:34:0d"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5cff0cbf3a5c4a4aadb3399a31adff0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbd4aa76-48", "ovs_interfaceid": "fbd4aa76-4861-41fe-a0dc-5dee747b2517", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:33:34:0d,bridge_name='br-int',has_traffic_filtering=True,id=fbd4aa76-4861-41fe-a0dc-5dee747b2517,network=Network(ba4ccc60-ffcd-4638-9c93-6d4d59641961),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbd4aa76-48') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.objects.instance [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lazy-loading 'pci_devices' on Instance uuid f1a14b79-7792-4962-bbe1-ec11e10e6948 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] End _get_guest_xml xml= [guest domain XML not preserved in this capture; remaining text values: uuid f1a14b79-7792-4962-bbe1-ec11e10e6948, name instance-0000000a, memory 131072 KiB, 1 vCPU, nova display name tempest-ServerRescueNegativeTestJSON-server-265518045, creation time 2023-04-24 00:14:03, flavor values 128/1/0/0/1 (consistent with m1.nano), owner tempest-ServerRescueNegativeTestJSON-487575741-project-member / tempest-ServerRescueNegativeTestJSON-487575741, sysinfo OpenStack Foundation, OpenStack Nova 0.0.0, Virtual Machine, os type hvm, CPU model Nehalem, rng backend /dev/urandom] {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 24 00:14:03 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Instance destroyed successfully.
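The rescue flow above builds disk.rescue as a qcow2 overlay on top of the cached base image 26d4c718c7a2a978d2022c858a570bbc0ccab5d8: qemu-img info is run on the base under a per-image lock, then qemu-img create adds a copy-on-write layer via the backing_file/backing_fmt options. A minimal standalone sketch of those two calls, using plain subprocess instead of nova's oslo_concurrency wrappers and a hypothetical overlay path:

    import json
    import subprocess

    base = "/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8"
    overlay = "/tmp/disk.rescue"  # hypothetical target path, for illustration only

    # Inspect the shared base image (--force-share allows reading it while guests use it).
    info = json.loads(subprocess.run(
        ["qemu-img", "info", base, "--force-share", "--output=json"],
        check=True, capture_output=True, text=True).stdout)
    print(info["format"], info["virtual-size"])

    # Create a copy-on-write overlay whose backing file is the base image;
    # only blocks written by the rescued guest land in the overlay.
    subprocess.run(
        ["qemu-img", "create", "-f", "qcow2",
         "-o", f"backing_file={base},backing_fmt=raw", overlay],
        check=True)

Nova additionally serializes this behind the image-hash lock seen in the entries above, since several instances share the same base file; the sketch omits that locking.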
Apr 24 00:14:03 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] No BDM found with device name vda, not building metadata. {{(pid=71205) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] No BDM found with device name vdb, not building metadata. {{(pid=71205) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 24 00:14:03 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] No VIF found with MAC fa:16:3e:33:34:0d, not building metadata {{(pid=71205) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 24 00:14:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:04 user nova-compute[71205]: DEBUG nova.compute.manager [req-8752fc2f-9518-4ac9-9096-870cd8694cd5 req-31d59a52-956c-4554-90eb-584e34126bdd service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Received event network-vif-plugged-fbd4aa76-4861-41fe-a0dc-5dee747b2517 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:14:04 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-8752fc2f-9518-4ac9-9096-870cd8694cd5 req-31d59a52-956c-4554-90eb-584e34126bdd service nova] Acquiring lock "f1a14b79-7792-4962-bbe1-ec11e10e6948-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:04 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-8752fc2f-9518-4ac9-9096-870cd8694cd5 req-31d59a52-956c-4554-90eb-584e34126bdd service nova] Lock "f1a14b79-7792-4962-bbe1-ec11e10e6948-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:04 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-8752fc2f-9518-4ac9-9096-870cd8694cd5 req-31d59a52-956c-4554-90eb-584e34126bdd service nova] Lock "f1a14b79-7792-4962-bbe1-ec11e10e6948-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:04 user nova-compute[71205]: DEBUG nova.compute.manager [req-8752fc2f-9518-4ac9-9096-870cd8694cd5 req-31d59a52-956c-4554-90eb-584e34126bdd service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] No waiting events found dispatching network-vif-plugged-fbd4aa76-4861-41fe-a0dc-5dee747b2517 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:14:04 user nova-compute[71205]: WARNING nova.compute.manager [req-8752fc2f-9518-4ac9-9096-870cd8694cd5 req-31d59a52-956c-4554-90eb-584e34126bdd service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Received unexpected event network-vif-plugged-fbd4aa76-4861-41fe-a0dc-5dee747b2517 for instance with vm_state active and task_state rescuing. Apr 24 00:14:05 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:05 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:05 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:05 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:06 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Acquiring lock "38c59ffc-494a-4d2b-a199-226a6e7cc683" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:06 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "38c59ffc-494a-4d2b-a199-226a6e7cc683" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:06 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Starting instance... 
{{(pid=71205) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 24 00:14:06 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:06 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:06 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71205) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 24 00:14:06 user nova-compute[71205]: INFO nova.compute.claims [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Claim successful on node user Apr 24 00:14:06 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:14:06 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:14:06 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.363s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:06 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Start building networks asynchronously for instance. 
{{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 24 00:14:06 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Allocating IP information in the background. {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 24 00:14:06 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] allocate_for_instance() {{(pid=71205) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 24 00:14:06 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 24 00:14:06 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Start building block device mappings for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 24 00:14:06 user nova-compute[71205]: DEBUG nova.policy [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '471d341199f0431a95ae54651c4f0780', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1d063c2bdc884fb8b826b9fb6fd97405', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71205) authorize /opt/stack/nova/nova/policy.py:203}} Apr 24 00:14:06 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Start spawning the instance on the hypervisor. 
{{(pid=71205) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 24 00:14:06 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Creating instance directory {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 24 00:14:06 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Creating image(s) Apr 24 00:14:06 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Acquiring lock "/opt/stack/data/nova/instances/38c59ffc-494a-4d2b-a199-226a6e7cc683/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:06 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "/opt/stack/data/nova/instances/38c59ffc-494a-4d2b-a199-226a6e7cc683/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:06 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "/opt/stack/data/nova/instances/38c59ffc-494a-4d2b-a199-226a6e7cc683/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:06 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:06 user nova-compute[71205]: DEBUG nova.compute.manager [req-56f3c358-e6a4-4582-a085-22ab346b1126 req-a3ee9703-cc90-45d6-aea2-7559b70be608 service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Received event network-vif-plugged-fbd4aa76-4861-41fe-a0dc-5dee747b2517 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:14:06 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-56f3c358-e6a4-4582-a085-22ab346b1126 req-a3ee9703-cc90-45d6-aea2-7559b70be608 service nova] Acquiring lock "f1a14b79-7792-4962-bbe1-ec11e10e6948-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:06 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-56f3c358-e6a4-4582-a085-22ab346b1126 req-a3ee9703-cc90-45d6-aea2-7559b70be608 service nova] Lock "f1a14b79-7792-4962-bbe1-ec11e10e6948-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:06 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-56f3c358-e6a4-4582-a085-22ab346b1126 req-a3ee9703-cc90-45d6-aea2-7559b70be608 service nova] Lock "f1a14b79-7792-4962-bbe1-ec11e10e6948-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:06 user nova-compute[71205]: DEBUG nova.compute.manager [req-56f3c358-e6a4-4582-a085-22ab346b1126 req-a3ee9703-cc90-45d6-aea2-7559b70be608 service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] No waiting events found dispatching network-vif-plugged-fbd4aa76-4861-41fe-a0dc-5dee747b2517 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:14:06 user nova-compute[71205]: WARNING nova.compute.manager [req-56f3c358-e6a4-4582-a085-22ab346b1126 req-a3ee9703-cc90-45d6-aea2-7559b70be608 service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Received unexpected event network-vif-plugged-fbd4aa76-4861-41fe-a0dc-5dee747b2517 for instance with vm_state active and task_state rescuing. Apr 24 00:14:06 user nova-compute[71205]: DEBUG nova.compute.manager [req-56f3c358-e6a4-4582-a085-22ab346b1126 req-a3ee9703-cc90-45d6-aea2-7559b70be608 service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Received event network-vif-plugged-fbd4aa76-4861-41fe-a0dc-5dee747b2517 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:14:06 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-56f3c358-e6a4-4582-a085-22ab346b1126 req-a3ee9703-cc90-45d6-aea2-7559b70be608 service nova] Acquiring lock "f1a14b79-7792-4962-bbe1-ec11e10e6948-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:06 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-56f3c358-e6a4-4582-a085-22ab346b1126 req-a3ee9703-cc90-45d6-aea2-7559b70be608 service nova] Lock "f1a14b79-7792-4962-bbe1-ec11e10e6948-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:06 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-56f3c358-e6a4-4582-a085-22ab346b1126 req-a3ee9703-cc90-45d6-aea2-7559b70be608 service nova] Lock "f1a14b79-7792-4962-bbe1-ec11e10e6948-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:06 user nova-compute[71205]: DEBUG nova.compute.manager [req-56f3c358-e6a4-4582-a085-22ab346b1126 req-a3ee9703-cc90-45d6-aea2-7559b70be608 service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] No waiting events found dispatching 
network-vif-plugged-fbd4aa76-4861-41fe-a0dc-5dee747b2517 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:14:06 user nova-compute[71205]: WARNING nova.compute.manager [req-56f3c358-e6a4-4582-a085-22ab346b1126 req-a3ee9703-cc90-45d6-aea2-7559b70be608 service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Received unexpected event network-vif-plugged-fbd4aa76-4861-41fe-a0dc-5dee747b2517 for instance with vm_state active and task_state rescuing. Apr 24 00:14:07 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.149s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Acquiring lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:07 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:07 user nova-compute[71205]: DEBUG nova.virt.libvirt.host [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Removed pending event for f1a14b79-7792-4962-bbe1-ec11e10e6948 due to event {{(pid=71205) _event_emit_delayed /opt/stack/nova/nova/virt/libvirt/host.py:438}} Apr 24 00:14:07 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Resumed> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:14:07 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] VM Resumed (Lifecycle Event) Apr 24 00:14:07 user nova-compute[71205]: DEBUG nova.compute.manager [None req-2966e44c-c257-42ed-97d6-703c862b6f57 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Checking state {{(pid=71205) 
_get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:14:07 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:14:07 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:14:07 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] During sync_power_state the instance has a pending task (rescuing). Skip. Apr 24 00:14:07 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Started> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:14:07 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] VM Started (Lifecycle Event) Apr 24 00:14:07 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:14:07 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:14:07 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.152s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:07 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/38c59ffc-494a-4d2b-a199-226a6e7cc683/disk 1073741824 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:07 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] During sync_power_state the instance has a pending task (rescuing). Skip. 
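The root disk for instance 38c59ffc-494a-4d2b-a199-226a6e7cc683 is created the same way just above, except that qemu-img create is given an explicit trailing size of 1073741824 bytes, which is the m1.nano flavor's root_gb=1 converted to bytes. A rough sketch of that sizing and command construction (variable names are illustrative, not nova's):

    root_gb = 1  # m1.nano root_gb, from the flavor logged earlier
    size_bytes = root_gb * 1024 ** 3
    assert size_bytes == 1073741824  # the trailing argument in the logged qemu-img create

    base = "/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8"
    disk = "/opt/stack/data/nova/instances/38c59ffc-494a-4d2b-a199-226a6e7cc683/disk"
    cmd = ["qemu-img", "create", "-f", "qcow2",
           "-o", f"backing_file={base},backing_fmt=raw", disk, str(size_bytes)]

The explicit size sets the overlay's virtual size independently of the backing image, which is why the later can_resize_image check logs "Cannot resize image ... to a smaller size": the requested size is not larger than the virtual size already recorded in the qcow2 header, so the resize step is skipped.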
Apr 24 00:14:07 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/38c59ffc-494a-4d2b-a199-226a6e7cc683/disk 1073741824" returned: 0 in 0.056s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.213s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:07 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:07 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.138s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:07 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Checking if we can resize image /opt/stack/data/nova/instances/38c59ffc-494a-4d2b-a199-226a6e7cc683/disk. 
size=1073741824 {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 24 00:14:07 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/38c59ffc-494a-4d2b-a199-226a6e7cc683/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:07 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/38c59ffc-494a-4d2b-a199-226a6e7cc683/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:07 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Cannot resize image /opt/stack/data/nova/instances/38c59ffc-494a-4d2b-a199-226a6e7cc683/disk to a smaller size. {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 24 00:14:07 user nova-compute[71205]: DEBUG nova.objects.instance [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lazy-loading 'migration_context' on Instance uuid 38c59ffc-494a-4d2b-a199-226a6e7cc683 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:14:07 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Successfully created port: e9b64819-4e5d-4a42-aa9c-d460bb336094 {{(pid=71205) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 24 00:14:07 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Created local disks {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 24 00:14:07 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Ensure instance console log exists: /opt/stack/data/nova/instances/38c59ffc-494a-4d2b-a199-226a6e7cc683/console.log {{(pid=71205) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 24 00:14:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" 
{{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Successfully updated port: e9b64819-4e5d-4a42-aa9c-d460bb336094 {{(pid=71205) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Acquiring lock "refresh_cache-38c59ffc-494a-4d2b-a199-226a6e7cc683" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Acquired lock "refresh_cache-38c59ffc-494a-4d2b-a199-226a6e7cc683" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Building network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.compute.manager [req-0a4ca54b-b09e-463b-a448-359691c7cf88 req-c7e41f31-e913-43c1-a1aa-f13d34525e6f service nova] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Received event network-changed-e9b64819-4e5d-4a42-aa9c-d460bb336094 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.compute.manager [req-0a4ca54b-b09e-463b-a448-359691c7cf88 req-c7e41f31-e913-43c1-a1aa-f13d34525e6f service nova] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Refreshing instance network info cache due to event network-changed-e9b64819-4e5d-4a42-aa9c-d460bb336094. 
{{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-0a4ca54b-b09e-463b-a448-359691c7cf88 req-c7e41f31-e913-43c1-a1aa-f13d34525e6f service nova] Acquiring lock "refresh_cache-38c59ffc-494a-4d2b-a199-226a6e7cc683" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Instance cache missing network info. {{(pid=71205) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Updating instance_info_cache with network_info: [{"id": "e9b64819-4e5d-4a42-aa9c-d460bb336094", "address": "fa:16:3e:66:d7:60", "network": {"id": "c29459b8-bcaa-41ae-94fa-d61344bae22f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2015616970-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d063c2bdc884fb8b826b9fb6fd97405", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape9b64819-4e", "ovs_interfaceid": "e9b64819-4e5d-4a42-aa9c-d460bb336094", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Releasing lock "refresh_cache-38c59ffc-494a-4d2b-a199-226a6e7cc683" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Instance network_info: |[{"id": "e9b64819-4e5d-4a42-aa9c-d460bb336094", "address": "fa:16:3e:66:d7:60", "network": {"id": "c29459b8-bcaa-41ae-94fa-d61344bae22f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2015616970-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d063c2bdc884fb8b826b9fb6fd97405", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape9b64819-4e", "ovs_interfaceid": "e9b64819-4e5d-4a42-aa9c-d460bb336094", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-0a4ca54b-b09e-463b-a448-359691c7cf88 req-c7e41f31-e913-43c1-a1aa-f13d34525e6f service nova] Acquired lock "refresh_cache-38c59ffc-494a-4d2b-a199-226a6e7cc683" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.network.neutron [req-0a4ca54b-b09e-463b-a448-359691c7cf88 req-c7e41f31-e913-43c1-a1aa-f13d34525e6f service nova] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Refreshing network info cache for port e9b64819-4e5d-4a42-aa9c-d460bb336094 {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Start _get_guest_xml network_info=[{"id": "e9b64819-4e5d-4a42-aa9c-d460bb336094", "address": "fa:16:3e:66:d7:60", "network": {"id": "c29459b8-bcaa-41ae-94fa-d61344bae22f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2015616970-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d063c2bdc884fb8b826b9fb6fd97405", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape9b64819-4e", "ovs_interfaceid": "e9b64819-4e5d-4a42-aa9c-d460bb336094", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 
'image_id': 'fcf09ead-c5af-40cc-b5cf-92626e181ef9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 24 00:14:08 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:14:08 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71205) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-24T00:08:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=), allow threads: True {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Flavor limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Image limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Flavor pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Image pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:387}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Got 1 possible topologies {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:14:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1191061004',display_name='tempest-AttachVolumeTestJSON-server-1191061004',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-1191061004',id=13,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKcfTjP5WJcryNzsqlkun8gR9wahBjjV98bivT/lgH0hNMo4q0VTdNxQslRxtaeP2me2yW9f66D9TwOftMRDgzVFqTefGAIW0hAhjgdNDN4d4jy/uJaM5+9CQHsIpBa8tQ==',key_name='tempest-keypair-670582452',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1d063c2bdc884fb8b826b9fb6fd97405',ramdisk_id='',reservation_id='r-0x8zxbn0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-1425553791',owner_user_name='tempest-AttachVolumeTestJSON-1425553791-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:14:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='471d341199f0431a95ae54651c4f0780',uuid=38c59ffc-494a-4d2b-a199-226a6e7cc683,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e9b64819-4e5d-4a42-aa9c-d460bb336094", "address": "fa:16:3e:66:d7:60", "network": {"id": "c29459b8-bcaa-41ae-94fa-d61344bae22f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2015616970-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d063c2bdc884fb8b826b9fb6fd97405", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape9b64819-4e", "ovs_interfaceid": "e9b64819-4e5d-4a42-aa9c-d460bb336094", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71205) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Converting VIF {"id": "e9b64819-4e5d-4a42-aa9c-d460bb336094", "address": "fa:16:3e:66:d7:60", "network": {"id": "c29459b8-bcaa-41ae-94fa-d61344bae22f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2015616970-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "1d063c2bdc884fb8b826b9fb6fd97405", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape9b64819-4e", "ovs_interfaceid": "e9b64819-4e5d-4a42-aa9c-d460bb336094", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:d7:60,bridge_name='br-int',has_traffic_filtering=True,id=e9b64819-4e5d-4a42-aa9c-d460bb336094,network=Network(c29459b8-bcaa-41ae-94fa-d61344bae22f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9b64819-4e') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.objects.instance [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lazy-loading 'pci_devices' on Instance uuid 38c59ffc-494a-4d2b-a199-226a6e7cc683 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] End _get_guest_xml xml= Apr 24 00:14:08 user nova-compute[71205]: 38c59ffc-494a-4d2b-a199-226a6e7cc683 Apr 24 00:14:08 user nova-compute[71205]: instance-0000000d Apr 24 00:14:08 user nova-compute[71205]: 131072 Apr 24 00:14:08 user nova-compute[71205]: 1 Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: tempest-AttachVolumeTestJSON-server-1191061004 Apr 24 00:14:08 user nova-compute[71205]: 2023-04-24 00:14:08 Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: 128 Apr 24 00:14:08 user nova-compute[71205]: 1 Apr 24 00:14:08 user nova-compute[71205]: 0 Apr 24 00:14:08 user nova-compute[71205]: 0 Apr 24 00:14:08 user nova-compute[71205]: 1 Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: tempest-AttachVolumeTestJSON-1425553791-project-member Apr 24 00:14:08 user nova-compute[71205]: tempest-AttachVolumeTestJSON-1425553791 Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: OpenStack Foundation Apr 24 00:14:08 user nova-compute[71205]: OpenStack Nova Apr 24 00:14:08 user nova-compute[71205]: 0.0.0 Apr 24 00:14:08 user nova-compute[71205]: 
38c59ffc-494a-4d2b-a199-226a6e7cc683 Apr 24 00:14:08 user nova-compute[71205]: 38c59ffc-494a-4d2b-a199-226a6e7cc683 Apr 24 00:14:08 user nova-compute[71205]: Virtual Machine Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: hvm Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Nehalem Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: /dev/urandom Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: Apr 24 00:14:08 user nova-compute[71205]: {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:14:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1191061004',display_name='tempest-AttachVolumeTestJSON-server-1191061004',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-1191061004',id=13,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKcfTjP5WJcryNzsqlkun8gR9wahBjjV98bivT/lgH0hNMo4q0VTdNxQslRxtaeP2me2yW9f66D9TwOftMRDgzVFqTefGAIW0hAhjgdNDN4d4jy/uJaM5+9CQHsIpBa8tQ==',key_name='tempest-keypair-670582452',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1d063c2bdc884fb8b826b9fb6fd97405',ramdisk_id='',reservation_id='r-0x8zxbn0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-1425553791',owner_user_name='tempest-AttachVolumeTestJSON-1425553791-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:14:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='471d341199f0431a95ae54651c4f0780',uuid=38c59ffc-494a-4d2b-a199-226a6e7cc683,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e9b64819-4e5d-4a42-aa9c-d460bb336094", "address": "fa:16:3e:66:d7:60", "network": {"id": "c29459b8-bcaa-41ae-94fa-d61344bae22f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2015616970-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d063c2bdc884fb8b826b9fb6fd97405", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape9b64819-4e", "ovs_interfaceid": "e9b64819-4e5d-4a42-aa9c-d460bb336094", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Converting VIF {"id": "e9b64819-4e5d-4a42-aa9c-d460bb336094", "address": "fa:16:3e:66:d7:60", "network": {"id": "c29459b8-bcaa-41ae-94fa-d61344bae22f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2015616970-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": 
"1d063c2bdc884fb8b826b9fb6fd97405", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape9b64819-4e", "ovs_interfaceid": "e9b64819-4e5d-4a42-aa9c-d460bb336094", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:d7:60,bridge_name='br-int',has_traffic_filtering=True,id=e9b64819-4e5d-4a42-aa9c-d460bb336094,network=Network(c29459b8-bcaa-41ae-94fa-d61344bae22f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9b64819-4e') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG os_vif [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:d7:60,bridge_name='br-int',has_traffic_filtering=True,id=e9b64819-4e5d-4a42-aa9c-d460bb336094,network=Network(c29459b8-bcaa-41ae-94fa-d61344bae22f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9b64819-4e') {{(pid=71205) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape9b64819-4e, may_exist=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape9b64819-4e, col_values=(('external_ids', {'iface-id': 'e9b64819-4e5d-4a42-aa9c-d460bb336094', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:66:d7:60', 'vm-uuid': '38c59ffc-494a-4d2b-a199-226a6e7cc683'}),)) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on 
fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:08 user nova-compute[71205]: INFO os_vif [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:d7:60,bridge_name='br-int',has_traffic_filtering=True,id=e9b64819-4e5d-4a42-aa9c-d460bb336094,network=Network(c29459b8-bcaa-41ae-94fa-d61344bae22f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9b64819-4e') Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] No BDM found with device name vda, not building metadata. {{(pid=71205) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 24 00:14:08 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] No VIF found with MAC fa:16:3e:66:d7:60, not building metadata {{(pid=71205) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 24 00:14:09 user nova-compute[71205]: DEBUG nova.network.neutron [req-0a4ca54b-b09e-463b-a448-359691c7cf88 req-c7e41f31-e913-43c1-a1aa-f13d34525e6f service nova] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Updated VIF entry in instance network info cache for port e9b64819-4e5d-4a42-aa9c-d460bb336094. 
{{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:14:09 user nova-compute[71205]: DEBUG nova.network.neutron [req-0a4ca54b-b09e-463b-a448-359691c7cf88 req-c7e41f31-e913-43c1-a1aa-f13d34525e6f service nova] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Updating instance_info_cache with network_info: [{"id": "e9b64819-4e5d-4a42-aa9c-d460bb336094", "address": "fa:16:3e:66:d7:60", "network": {"id": "c29459b8-bcaa-41ae-94fa-d61344bae22f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2015616970-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d063c2bdc884fb8b826b9fb6fd97405", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape9b64819-4e", "ovs_interfaceid": "e9b64819-4e5d-4a42-aa9c-d460bb336094", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:14:09 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-0a4ca54b-b09e-463b-a448-359691c7cf88 req-c7e41f31-e913-43c1-a1aa-f13d34525e6f service nova] Releasing lock "refresh_cache-38c59ffc-494a-4d2b-a199-226a6e7cc683" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:14:10 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:10 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:10 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:10 user nova-compute[71205]: DEBUG nova.compute.manager [req-0b2730c9-882d-47ac-8e73-b90e3ca46346 req-64241d1d-b613-4771-8418-ef467a7961ef service nova] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Received event network-vif-plugged-e9b64819-4e5d-4a42-aa9c-d460bb336094 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:14:10 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-0b2730c9-882d-47ac-8e73-b90e3ca46346 req-64241d1d-b613-4771-8418-ef467a7961ef service nova] Acquiring lock "38c59ffc-494a-4d2b-a199-226a6e7cc683-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:10 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-0b2730c9-882d-47ac-8e73-b90e3ca46346 req-64241d1d-b613-4771-8418-ef467a7961ef service nova] Lock "38c59ffc-494a-4d2b-a199-226a6e7cc683-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:10 user 
nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-0b2730c9-882d-47ac-8e73-b90e3ca46346 req-64241d1d-b613-4771-8418-ef467a7961ef service nova] Lock "38c59ffc-494a-4d2b-a199-226a6e7cc683-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:10 user nova-compute[71205]: DEBUG nova.compute.manager [req-0b2730c9-882d-47ac-8e73-b90e3ca46346 req-64241d1d-b613-4771-8418-ef467a7961ef service nova] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] No waiting events found dispatching network-vif-plugged-e9b64819-4e5d-4a42-aa9c-d460bb336094 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:14:10 user nova-compute[71205]: WARNING nova.compute.manager [req-0b2730c9-882d-47ac-8e73-b90e3ca46346 req-64241d1d-b613-4771-8418-ef467a7961ef service nova] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Received unexpected event network-vif-plugged-e9b64819-4e5d-4a42-aa9c-d460bb336094 for instance with vm_state building and task_state spawning. Apr 24 00:14:11 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:11 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:11 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:11 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:11 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:14:11 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:14:11 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:14:11 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:14:11 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71205) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 24 00:14:12 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:14:12 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Starting heal instance info cache {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 24 00:14:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "refresh_cache-c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:14:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquired lock "refresh_cache-c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:14:12 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Forcefully refreshing network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 24 00:14:12 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Resumed> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:14:12 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] VM Resumed (Lifecycle Event) Apr 24 00:14:12 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Instance event wait completed in 0 seconds for {{(pid=71205) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 24 00:14:12 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Guest created on hypervisor {{(pid=71205) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 24 00:14:12 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Instance spawned successfully. 
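
The journal entries above share a fixed shape: a syslog prefix ("Apr 24 00:14:12 user nova-compute[71205]:"), a level (DEBUG/INFO/WARNING), a logger name, a request context in square brackets, an optional "[instance: UUID]" tag, the message, and a "{{(pid=...) function file:line}}" source-location trailer. The following is a minimal Python sketch, not part of Nova or of this log, for pulling one instance's messages out of a capture in this shape; it assumes one journal entry per line (unlike the wrapped text above), and the regexes, the helper name instance_events, and the file names in the usage comment are illustrative assumptions.

    import re
    import sys

    # Assumed entry shape (one entry per line), e.g.:
    #   Apr 24 00:14:12 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: UUID] message {{(pid=71205) func file:line}}
    ENTRY = re.compile(
        r'^(?P<ts>\w{3}\s+\d{1,2} \d{2}:\d{2}:\d{2}) \S+ nova-compute\[\d+\]: '
        r'(?P<level>DEBUG|INFO|WARNING|ERROR) (?P<logger>\S+) (?P<rest>.*)$'
    )
    INSTANCE = re.compile(r'\[instance: (?P<uuid>[0-9a-f-]{36})\] (?P<msg>.*)')
    TRAILER = re.compile(r'\s*\{\{\(pid=\d+\).*\}\}\s*$')

    def instance_events(lines, uuid):
        """Yield (timestamp, level, message) for entries tagged with one instance UUID."""
        for line in lines:
            entry = ENTRY.match(line)
            if not entry:
                continue
            tagged = INSTANCE.search(entry.group('rest'))
            if tagged and tagged.group('uuid') == uuid:
                # Drop the {{(pid=...) function file:line}} trailer from the message.
                yield entry.group('ts'), entry.group('level'), TRAILER.sub('', tagged.group('msg'))

    if __name__ == '__main__':
        # Usage (illustrative): python3 trace_instance.py < nova-cpu.log
        for ts, level, msg in instance_events(sys.stdin, '38c59ffc-494a-4d2b-a199-226a6e7cc683'):
            print(ts, level, msg)

Filtering on the build request ID (here req-6fd52e5f-5407-478b-8735-204e7d33e9a0) instead of the instance UUID is the other common way to follow a single boot, since entries such as the lock and OVSDB operations are not tagged with "[instance: ...]".
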
Apr 24 00:14:12 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 24 00:14:12 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:14:12 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:12 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:14:12 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Found default for hw_cdrom_bus of ide {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:14:12 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Found default for hw_disk_bus of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:14:12 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Found default for hw_input_bus of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:14:12 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Found default for hw_pointer_model of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:14:12 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Found default for hw_video_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:14:12 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None 
req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Found default for hw_vif_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:14:12 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:14:12 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Started> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:14:12 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] VM Started (Lifecycle Event) Apr 24 00:14:12 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:14:12 user nova-compute[71205]: DEBUG nova.compute.manager [req-129a2119-86db-45a8-9bc7-7318fa9b3e70 req-ef0aa930-55bd-4818-abd1-e319d1ed0597 service nova] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Received event network-vif-plugged-e9b64819-4e5d-4a42-aa9c-d460bb336094 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:14:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-129a2119-86db-45a8-9bc7-7318fa9b3e70 req-ef0aa930-55bd-4818-abd1-e319d1ed0597 service nova] Acquiring lock "38c59ffc-494a-4d2b-a199-226a6e7cc683-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-129a2119-86db-45a8-9bc7-7318fa9b3e70 req-ef0aa930-55bd-4818-abd1-e319d1ed0597 service nova] Lock "38c59ffc-494a-4d2b-a199-226a6e7cc683-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-129a2119-86db-45a8-9bc7-7318fa9b3e70 req-ef0aa930-55bd-4818-abd1-e319d1ed0597 service nova] Lock "38c59ffc-494a-4d2b-a199-226a6e7cc683-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:12 user nova-compute[71205]: DEBUG nova.compute.manager [req-129a2119-86db-45a8-9bc7-7318fa9b3e70 req-ef0aa930-55bd-4818-abd1-e319d1ed0597 service nova] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] No waiting events found dispatching network-vif-plugged-e9b64819-4e5d-4a42-aa9c-d460bb336094 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:14:12 user nova-compute[71205]: WARNING nova.compute.manager [req-129a2119-86db-45a8-9bc7-7318fa9b3e70 req-ef0aa930-55bd-4818-abd1-e319d1ed0597 service nova] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Received unexpected event 
network-vif-plugged-e9b64819-4e5d-4a42-aa9c-d460bb336094 for instance with vm_state building and task_state spawning. Apr 24 00:14:12 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:14:12 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:14:12 user nova-compute[71205]: INFO nova.compute.manager [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Took 5.96 seconds to spawn the instance on the hypervisor. Apr 24 00:14:12 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:14:12 user nova-compute[71205]: INFO nova.compute.manager [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Took 6.68 seconds to build instance. Apr 24 00:14:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6fd52e5f-5407-478b-8735-204e7d33e9a0 tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "38c59ffc-494a-4d2b-a199-226a6e7cc683" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.804s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Acquiring lock "eb48c285-06f8-4d48-a550-c3fb3c05e93a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "eb48c285-06f8-4d48-a550-c3fb3c05e93a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:13 user nova-compute[71205]: DEBUG nova.compute.manager [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Starting instance... 
{{(pid=71205) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 24 00:14:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:13 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71205) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 24 00:14:13 user nova-compute[71205]: INFO nova.compute.claims [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Claim successful on node user Apr 24 00:14:13 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Updating instance_info_cache with network_info: [{"id": "7a3b1d96-2c84-4994-b698-c59fb56c44f8", "address": "fa:16:3e:06:2a:5d", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a3b1d96-2c", "ovs_interfaceid": "7a3b1d96-2c84-4994-b698-c59fb56c44f8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:14:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Releasing lock "refresh_cache-c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:14:13 user nova-compute[71205]: DEBUG nova.compute.manager [None 
req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Updated the network info_cache for instance {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 24 00:14:13 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:14:13 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:14:13 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:14:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:13 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:14:13 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:14:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.482s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:13 user nova-compute[71205]: DEBUG nova.compute.manager [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Start building networks asynchronously for instance. 
{{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 24 00:14:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.307s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:13 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:14:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:13 user nova-compute[71205]: DEBUG nova.compute.manager [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Allocating IP information in the background. {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 24 00:14:13 user nova-compute[71205]: DEBUG nova.network.neutron [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] allocate_for_instance() {{(pid=71205) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 24 00:14:13 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 24 00:14:13 user nova-compute[71205]: DEBUG nova.compute.manager [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Start building block device mappings for instance. 
{{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk.rescue --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG nova.compute.manager [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Start spawning the instance on the hypervisor. {{(pid=71205) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Creating instance directory {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 24 00:14:14 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Creating image(s) Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Acquiring lock "/opt/stack/data/nova/instances/eb48c285-06f8-4d48-a550-c3fb3c05e93a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "/opt/stack/data/nova/instances/eb48c285-06f8-4d48-a550-c3fb3c05e93a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "/opt/stack/data/nova/instances/eb48c285-06f8-4d48-a550-c3fb3c05e93a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C 
LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG nova.policy [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '539997e65f4f4ef7998a4386d19a5e9f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e2bf3154181247f8963be8cd31399851', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71205) authorize /opt/stack/nova/nova/policy.py:203}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk.rescue --force-share --output=json" returned: 0 in 0.161s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk.rescue --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.176s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Acquiring lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None 
req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk.rescue --force-share --output=json" returned: 0 in 0.132s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.195s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/eb48c285-06f8-4d48-a550-c3fb3c05e93a/disk 1073741824 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/eb48c285-06f8-4d48-a550-c3fb3c05e93a/disk 1073741824" returned: 0 in 0.068s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" "released" by 
"nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.271s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk --force-share --output=json" returned: 0 in 0.241s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.161s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Checking if we can resize image /opt/stack/data/nova/instances/eb48c285-06f8-4d48-a550-c3fb3c05e93a/disk. 
size=1073741824 {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/eb48c285-06f8-4d48-a550-c3fb3c05e93a/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk --force-share --output=json" returned: 0 in 0.163s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1821ecf2-8c71-48ad-96da-f63b83439c6d/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/eb48c285-06f8-4d48-a550-c3fb3c05e93a/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Cannot resize image /opt/stack/data/nova/instances/eb48c285-06f8-4d48-a550-c3fb3c05e93a/disk to a smaller size. 
{{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG nova.objects.instance [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lazy-loading 'migration_context' on Instance uuid eb48c285-06f8-4d48-a550-c3fb3c05e93a {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Created local disks {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Ensure instance console log exists: /opt/stack/data/nova/instances/eb48c285-06f8-4d48-a550-c3fb3c05e93a/console.log {{(pid=71205) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1821ecf2-8c71-48ad-96da-f63b83439c6d/disk --force-share --output=json" returned: 0 in 0.161s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1821ecf2-8c71-48ad-96da-f63b83439c6d/disk --force-share --output=json {{(pid=71205) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1821ecf2-8c71-48ad-96da-f63b83439c6d/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk --force-share --output=json" returned: 0 in 0.150s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:15 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json" returned: 0 in 0.159s {{(pid=71205) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json" returned: 0 in 0.167s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:15 user nova-compute[71205]: DEBUG nova.network.neutron [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Successfully created port: 44f2910d-d36a-479f-8fa0-f16f9a406765 {{(pid=71205) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 24 00:14:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/38c59ffc-494a-4d2b-a199-226a6e7cc683/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/38c59ffc-494a-4d2b-a199-226a6e7cc683/disk --force-share --output=json" returned: 0 in 0.175s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/38c59ffc-494a-4d2b-a199-226a6e7cc683/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:16 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/38c59ffc-494a-4d2b-a199-226a6e7cc683/disk --force-share --output=json" returned: 0 in 0.165s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:16 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:16 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:16 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:16 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk --force-share --output=json" returned: 0 in 0.203s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:16 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:16 user nova-compute[71205]: DEBUG nova.network.neutron [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Successfully updated port: 44f2910d-d36a-479f-8fa0-f16f9a406765 {{(pid=71205) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 24 00:14:16 user nova-compute[71205]: DEBUG nova.compute.manager [req-acd1c835-dcfc-4043-b01d-e09db53d977b req-41e535f1-c1a5-4595-9cfd-8c5c27ef3b8f service nova] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Received event network-changed-44f2910d-d36a-479f-8fa0-f16f9a406765 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:14:16 user nova-compute[71205]: DEBUG nova.compute.manager [req-acd1c835-dcfc-4043-b01d-e09db53d977b req-41e535f1-c1a5-4595-9cfd-8c5c27ef3b8f service nova] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Refreshing instance network info cache due to event network-changed-44f2910d-d36a-479f-8fa0-f16f9a406765. 
{{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:14:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-acd1c835-dcfc-4043-b01d-e09db53d977b req-41e535f1-c1a5-4595-9cfd-8c5c27ef3b8f service nova] Acquiring lock "refresh_cache-eb48c285-06f8-4d48-a550-c3fb3c05e93a" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:14:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-acd1c835-dcfc-4043-b01d-e09db53d977b req-41e535f1-c1a5-4595-9cfd-8c5c27ef3b8f service nova] Acquired lock "refresh_cache-eb48c285-06f8-4d48-a550-c3fb3c05e93a" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:14:16 user nova-compute[71205]: DEBUG nova.network.neutron [req-acd1c835-dcfc-4043-b01d-e09db53d977b req-41e535f1-c1a5-4595-9cfd-8c5c27ef3b8f service nova] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Refreshing network info cache for port 44f2910d-d36a-479f-8fa0-f16f9a406765 {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:14:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Acquiring lock "refresh_cache-eb48c285-06f8-4d48-a550-c3fb3c05e93a" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:14:16 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:16 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:16 user nova-compute[71205]: DEBUG nova.network.neutron [req-acd1c835-dcfc-4043-b01d-e09db53d977b req-41e535f1-c1a5-4595-9cfd-8c5c27ef3b8f service nova] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Instance cache missing network info. 
{{(pid=71205) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 24 00:14:16 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:16 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f2a3766c-0a08-4eb5-a833-e39eb73d3426/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:16 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f2a3766c-0a08-4eb5-a833-e39eb73d3426/disk --force-share --output=json" returned: 0 in 0.161s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:16 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f2a3766c-0a08-4eb5-a833-e39eb73d3426/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:17 user nova-compute[71205]: DEBUG nova.network.neutron [req-acd1c835-dcfc-4043-b01d-e09db53d977b req-41e535f1-c1a5-4595-9cfd-8c5c27ef3b8f service nova] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:14:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-acd1c835-dcfc-4043-b01d-e09db53d977b req-41e535f1-c1a5-4595-9cfd-8c5c27ef3b8f service nova] Releasing lock "refresh_cache-eb48c285-06f8-4d48-a550-c3fb3c05e93a" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:14:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Acquired lock "refresh_cache-eb48c285-06f8-4d48-a550-c3fb3c05e93a" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:14:17 user nova-compute[71205]: DEBUG nova.network.neutron [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Building network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 24 00:14:17 user nova-compute[71205]: DEBUG oslo_concurrency.processutils 
[None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f2a3766c-0a08-4eb5-a833-e39eb73d3426/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:17 user nova-compute[71205]: DEBUG nova.network.neutron [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Instance cache missing network info. {{(pid=71205) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 24 00:14:17 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:14:17 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:14:17 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Hypervisor/Node resource view: name=user free_ram=8204MB free_disk=26.56546401977539GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": 
null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", 
"address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:14:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:17 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance ce19423d-a6ee-4506-9cd1-ec4803abdd86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:14:17 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:14:17 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:14:17 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance ac38bbc2-2229-4497-b501-e9230ec59a32 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:14:17 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance f1a14b79-7792-4962-bbe1-ec11e10e6948 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:14:17 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance 1821ecf2-8c71-48ad-96da-f63b83439c6d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:14:17 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance f2a3766c-0a08-4eb5-a833-e39eb73d3426 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:14:17 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance 38c59ffc-494a-4d2b-a199-226a6e7cc683 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:14:17 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance eb48c285-06f8-4d48-a550-c3fb3c05e93a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:14:17 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Total usable vcpus: 12, total allocated vcpus: 9 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:14:17 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Final resource view: name=user phys_ram=16023MB used_ram=1664MB phys_disk=40GB used_disk=9GB total_vcpus=12 used_vcpus=9 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:14:17 user nova-compute[71205]: DEBUG nova.network.neutron [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Updating instance_info_cache with network_info: [{"id": "44f2910d-d36a-479f-8fa0-f16f9a406765", "address": "fa:16:3e:ed:01:64", "network": {"id": "e09e353d-220c-429c-9767-a11a59bbf99a", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2107564613-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e2bf3154181247f8963be8cd31399851", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap44f2910d-d3", "ovs_interfaceid": "44f2910d-d36a-479f-8fa0-f16f9a406765", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Releasing lock "refresh_cache-eb48c285-06f8-4d48-a550-c3fb3c05e93a" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG nova.compute.manager [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Instance network_info: |[{"id": "44f2910d-d36a-479f-8fa0-f16f9a406765", "address": "fa:16:3e:ed:01:64", "network": {"id": "e09e353d-220c-429c-9767-a11a59bbf99a", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2107564613-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e2bf3154181247f8963be8cd31399851", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": 
{"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap44f2910d-d3", "ovs_interfaceid": "44f2910d-d36a-479f-8fa0-f16f9a406765", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Start _get_guest_xml network_info=[{"id": "44f2910d-d36a-479f-8fa0-f16f9a406765", "address": "fa:16:3e:ed:01:64", "network": {"id": "e09e353d-220c-429c-9767-a11a59bbf99a", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2107564613-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e2bf3154181247f8963be8cd31399851", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap44f2910d-d3", "ovs_interfaceid": "44f2910d-d36a-479f-8fa0-f16f9a406765", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': 'fcf09ead-c5af-40cc-b5cf-92626e181ef9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 24 00:14:18 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:14:18 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 24 00:14:18 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71205) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-24T00:08:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=), allow threads: True {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Flavor limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Image limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Flavor pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Image pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) 
{{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Got 1 possible topologies {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:14:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-593649378',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-593649378',id=14,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH3GEjjTKC06op3WsPivqO+l4BSt54OlT00thV38HKoIy/ZNPvfWSo0jJrcHAlGb/+rJGJfe0UfKW92qrg1FUtDiP9SzVcY4eMX4ApUmWqGlNNLLT473cBCHXS8s2TWXxg==',key_name='tempest-keypair-1058824653',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e2bf3154181247f8963be8cd31399851',ramdisk_id='',reservation_id='r-tpmp2uga',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1947115496',owner_user_name='tempest-AttachVolumeShelveTestJSON-1947115496-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:14:14Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='539997e65f4f4ef7998a4386d19a5e9f',uuid=eb48c285-06f8-4d48-a550-c3fb3c05e93a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "44f2910d-d36a-479f-8fa0-f16f9a406765", "address": "fa:16:3e:ed:01:64", "network": {"id": "e09e353d-220c-429c-9767-a11a59bbf99a", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2107564613-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e2bf3154181247f8963be8cd31399851", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap44f2910d-d3", "ovs_interfaceid": "44f2910d-d36a-479f-8fa0-f16f9a406765", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71205) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Converting VIF {"id": "44f2910d-d36a-479f-8fa0-f16f9a406765", "address": "fa:16:3e:ed:01:64", "network": {"id": "e09e353d-220c-429c-9767-a11a59bbf99a", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2107564613-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": 
"10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e2bf3154181247f8963be8cd31399851", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap44f2910d-d3", "ovs_interfaceid": "44f2910d-d36a-479f-8fa0-f16f9a406765", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ed:01:64,bridge_name='br-int',has_traffic_filtering=True,id=44f2910d-d36a-479f-8fa0-f16f9a406765,network=Network(e09e353d-220c-429c-9767-a11a59bbf99a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44f2910d-d3') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG nova.objects.instance [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lazy-loading 'pci_devices' on Instance uuid eb48c285-06f8-4d48-a550-c3fb3c05e93a {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] End _get_guest_xml xml= Apr 24 00:14:18 user nova-compute[71205]: eb48c285-06f8-4d48-a550-c3fb3c05e93a Apr 24 00:14:18 user nova-compute[71205]: instance-0000000e Apr 24 00:14:18 user nova-compute[71205]: 131072 Apr 24 00:14:18 user nova-compute[71205]: 1 Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: tempest-AttachVolumeShelveTestJSON-server-593649378 Apr 24 00:14:18 user nova-compute[71205]: 2023-04-24 00:14:18 Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: 128 Apr 24 00:14:18 user nova-compute[71205]: 1 Apr 24 00:14:18 user nova-compute[71205]: 0 Apr 24 00:14:18 user nova-compute[71205]: 0 Apr 24 00:14:18 user nova-compute[71205]: 1 Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: tempest-AttachVolumeShelveTestJSON-1947115496-project-member Apr 24 00:14:18 user nova-compute[71205]: tempest-AttachVolumeShelveTestJSON-1947115496 Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: OpenStack Foundation Apr 24 00:14:18 user nova-compute[71205]: OpenStack Nova Apr 24 00:14:18 user 
nova-compute[71205]: 0.0.0 Apr 24 00:14:18 user nova-compute[71205]: eb48c285-06f8-4d48-a550-c3fb3c05e93a Apr 24 00:14:18 user nova-compute[71205]: eb48c285-06f8-4d48-a550-c3fb3c05e93a Apr 24 00:14:18 user nova-compute[71205]: Virtual Machine Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: hvm Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Nehalem Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: /dev/urandom Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: Apr 24 00:14:18 user nova-compute[71205]: {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:14:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-593649378',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-593649378',id=14,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH3GEjjTKC06op3WsPivqO+l4BSt54OlT00thV38HKoIy/ZNPvfWSo0jJrcHAlGb/+rJGJfe0UfKW92qrg1FUtDiP9SzVcY4eMX4ApUmWqGlNNLLT473cBCHXS8s2TWXxg==',key_name='tempest-keypair-1058824653',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e2bf3154181247f8963be8cd31399851',ramdisk_id='',reservation_id='r-tpmp2uga',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1947115496',owner_user_name='tempest-AttachVolumeShelveTestJSON-1947115496-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:14:14Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='539997e65f4f4ef7998a4386d19a5e9f',uuid=eb48c285-06f8-4d48-a550-c3fb3c05e93a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "44f2910d-d36a-479f-8fa0-f16f9a406765", "address": "fa:16:3e:ed:01:64", "network": {"id": "e09e353d-220c-429c-9767-a11a59bbf99a", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2107564613-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e2bf3154181247f8963be8cd31399851", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap44f2910d-d3", "ovs_interfaceid": "44f2910d-d36a-479f-8fa0-f16f9a406765", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Converting VIF {"id": "44f2910d-d36a-479f-8fa0-f16f9a406765", "address": "fa:16:3e:ed:01:64", "network": {"id": "e09e353d-220c-429c-9767-a11a59bbf99a", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2107564613-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], 
"meta": {"injected": false, "tenant_id": "e2bf3154181247f8963be8cd31399851", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap44f2910d-d3", "ovs_interfaceid": "44f2910d-d36a-479f-8fa0-f16f9a406765", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ed:01:64,bridge_name='br-int',has_traffic_filtering=True,id=44f2910d-d36a-479f-8fa0-f16f9a406765,network=Network(e09e353d-220c-429c-9767-a11a59bbf99a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44f2910d-d3') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG os_vif [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ed:01:64,bridge_name='br-int',has_traffic_filtering=True,id=44f2910d-d36a-479f-8fa0-f16f9a406765,network=Network(e09e353d-220c-429c-9767-a11a59bbf99a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44f2910d-d3') {{(pid=71205) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap44f2910d-d3, may_exist=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap44f2910d-d3, col_values=(('external_ids', {'iface-id': '44f2910d-d36a-479f-8fa0-f16f9a406765', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ed:01:64', 'vm-uuid': 'eb48c285-06f8-4d48-a550-c3fb3c05e93a'}),)) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:14:18 user 
nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:18 user nova-compute[71205]: INFO os_vif [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ed:01:64,bridge_name='br-int',has_traffic_filtering=True,id=44f2910d-d36a-479f-8fa0-f16f9a406765,network=Network(e09e353d-220c-429c-9767-a11a59bbf99a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44f2910d-d3') Apr 24 00:14:18 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] No BDM found with device name vda, not building metadata. {{(pid=71205) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] No VIF found with MAC fa:16:3e:ed:01:64, not building metadata {{(pid=71205) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.697s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 
{{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:19 user nova-compute[71205]: DEBUG nova.compute.manager [req-9de6f21d-db25-429e-8c65-b6fed84af580 req-21fb96d2-a52d-4727-bb20-745ad1e7f686 service nova] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Received event network-vif-plugged-44f2910d-d36a-479f-8fa0-f16f9a406765 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:14:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-9de6f21d-db25-429e-8c65-b6fed84af580 req-21fb96d2-a52d-4727-bb20-745ad1e7f686 service nova] Acquiring lock "eb48c285-06f8-4d48-a550-c3fb3c05e93a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-9de6f21d-db25-429e-8c65-b6fed84af580 req-21fb96d2-a52d-4727-bb20-745ad1e7f686 service nova] Lock "eb48c285-06f8-4d48-a550-c3fb3c05e93a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-9de6f21d-db25-429e-8c65-b6fed84af580 req-21fb96d2-a52d-4727-bb20-745ad1e7f686 service nova] Lock "eb48c285-06f8-4d48-a550-c3fb3c05e93a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:19 user nova-compute[71205]: DEBUG nova.compute.manager [req-9de6f21d-db25-429e-8c65-b6fed84af580 req-21fb96d2-a52d-4727-bb20-745ad1e7f686 service nova] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] No waiting events found dispatching network-vif-plugged-44f2910d-d36a-479f-8fa0-f16f9a406765 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:14:19 user nova-compute[71205]: WARNING nova.compute.manager [req-9de6f21d-db25-429e-8c65-b6fed84af580 req-21fb96d2-a52d-4727-bb20-745ad1e7f686 service nova] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Received unexpected event network-vif-plugged-44f2910d-d36a-479f-8fa0-f16f9a406765 for instance with vm_state building and task_state spawning. 
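The AddPortCommand/DbSetCommand transaction logged above attaches the tap device to br-int and tags the Interface row with the Neutron port id, MAC, and instance UUID so the OVN/OVS layer can bind the port; Neutron then sends network-vif-plugged back to nova, and because no waiter was registered yet ("No waiting events found dispatching"), the compute manager logs the typically benign "Received unexpected event" warning while the instance is still spawning. For illustration, the sketch below reproduces the same two steps as one atomic ovs-vsctl transaction driven from Python; plug_ovs_port is an invented helper name and the values are copied from the log, so this is not os-vif's actual code path (os-vif uses the ovsdbapp IDL directly).

```python
# Hypothetical equivalent of the ovsdbapp transaction above, expressed
# as a single ovs-vsctl invocation ("--" chains the sub-commands into
# one OVSDB transaction). Not os-vif's implementation.
import subprocess


def plug_ovs_port(bridge, devname, iface_id, mac, vm_uuid):
    subprocess.run(
        [
            "ovs-vsctl",
            "--may-exist", "add-port", bridge, devname,
            "--",
            "set", "Interface", devname,
            f"external_ids:iface-id={iface_id}",
            "external_ids:iface-status=active",
            f"external_ids:attached-mac={mac}",
            f"external_ids:vm-uuid={vm_uuid}",
        ],
        check=True,
    )


if __name__ == "__main__":
    plug_ovs_port(
        bridge="br-int",
        devname="tap44f2910d-d3",
        iface_id="44f2910d-d36a-479f-8fa0-f16f9a406765",
        mac="fa:16:3e:ed:01:64",
        vm_uuid="eb48c285-06f8-4d48-a550-c3fb3c05e93a",
    )
```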
Apr 24 00:14:20 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:20 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:20 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:20 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:20 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:21 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:14:21 user nova-compute[71205]: DEBUG nova.compute.manager [req-17fbb5c6-5565-4571-a8b0-ed72807b279f req-666654ee-a652-49f3-a65d-d132a535a7b6 service nova] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Received event network-vif-plugged-44f2910d-d36a-479f-8fa0-f16f9a406765 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:14:21 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-17fbb5c6-5565-4571-a8b0-ed72807b279f req-666654ee-a652-49f3-a65d-d132a535a7b6 service nova] Acquiring lock "eb48c285-06f8-4d48-a550-c3fb3c05e93a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:21 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-17fbb5c6-5565-4571-a8b0-ed72807b279f req-666654ee-a652-49f3-a65d-d132a535a7b6 service nova] Lock "eb48c285-06f8-4d48-a550-c3fb3c05e93a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:21 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-17fbb5c6-5565-4571-a8b0-ed72807b279f req-666654ee-a652-49f3-a65d-d132a535a7b6 service nova] Lock "eb48c285-06f8-4d48-a550-c3fb3c05e93a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:21 user nova-compute[71205]: DEBUG nova.compute.manager [req-17fbb5c6-5565-4571-a8b0-ed72807b279f req-666654ee-a652-49f3-a65d-d132a535a7b6 service nova] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] No waiting events found dispatching network-vif-plugged-44f2910d-d36a-479f-8fa0-f16f9a406765 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:14:21 user nova-compute[71205]: WARNING nova.compute.manager [req-17fbb5c6-5565-4571-a8b0-ed72807b279f req-666654ee-a652-49f3-a65d-d132a535a7b6 service nova] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Received unexpected event network-vif-plugged-44f2910d-d36a-479f-8fa0-f16f9a406765 for instance with 
vm_state building and task_state spawning. Apr 24 00:14:22 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Resumed> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:14:22 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] VM Resumed (Lifecycle Event) Apr 24 00:14:22 user nova-compute[71205]: DEBUG nova.compute.manager [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Instance event wait completed in 0 seconds for {{(pid=71205) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 24 00:14:22 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Guest created on hypervisor {{(pid=71205) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 24 00:14:22 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Instance spawned successfully. Apr 24 00:14:22 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 24 00:14:22 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:14:22 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:14:22 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Found default for hw_cdrom_bus of ide {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:14:22 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Found default for hw_disk_bus of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:14:22 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver 
[None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Found default for hw_input_bus of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:14:22 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Found default for hw_pointer_model of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:14:22 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Found default for hw_video_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:14:22 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Found default for hw_vif_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:14:22 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:14:22 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Started> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:14:22 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] VM Started (Lifecycle Event) Apr 24 00:14:22 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:14:22 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:14:22 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:14:22 user nova-compute[71205]: INFO nova.compute.manager [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Took 8.04 seconds to spawn the instance on the hypervisor. 
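The pair of "During sync_power_state the instance has a pending task (spawning). Skip." messages above reflects a guard in the lifecycle-event handling: while a task still owns the instance, the handler does not reconcile the DB power_state (0, NOSTATE) with what the hypervisor reports (1, RUNNING), so the in-flight build cannot be fought over. A condensed illustration of that decision follows, using plain dicts and simplified names rather than nova objects; it is a sketch of the idea, not nova's implementation.

```python
# Simplified power-state sync guard. Constants match the values seen in
# the log: 0 = NOSTATE (DB value while building), 1 = RUNNING (libvirt).
NOSTATE, RUNNING = 0, 1


def sync_power_state(instance, vm_power_state):
    if instance["task_state"] is not None:
        # A task (e.g. 'spawning') owns the instance: do not touch it.
        return f"pending task ({instance['task_state']}). Skip."
    if instance["power_state"] != vm_power_state:
        instance["power_state"] = vm_power_state
        return "power state updated"
    return "in sync"


building = {"task_state": "spawning", "power_state": NOSTATE}
print(sync_power_state(building, RUNNING))  # pending task (spawning). Skip.
```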
Apr 24 00:14:22 user nova-compute[71205]: DEBUG nova.compute.manager [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:14:22 user nova-compute[71205]: INFO nova.compute.manager [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Took 8.99 seconds to build instance. Apr 24 00:14:22 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-87757731-43de-443d-bf1a-5df699e24203 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "eb48c285-06f8-4d48-a550-c3fb3c05e93a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 9.127s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:23 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:23 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:23 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:25 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquiring lock "f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:25 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:25 user nova-compute[71205]: DEBUG nova.compute.manager [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Starting instance... 
{{(pid=71205) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 24 00:14:25 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:25 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:25 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71205) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 24 00:14:25 user nova-compute[71205]: INFO nova.compute.claims [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Claim successful on node user Apr 24 00:14:26 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:14:26 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:14:26 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.561s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:26 user nova-compute[71205]: DEBUG nova.compute.manager [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 
tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Start building networks asynchronously for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 24 00:14:26 user nova-compute[71205]: DEBUG nova.compute.manager [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Allocating IP information in the background. {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 24 00:14:26 user nova-compute[71205]: DEBUG nova.network.neutron [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] allocate_for_instance() {{(pid=71205) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 24 00:14:26 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 24 00:14:26 user nova-compute[71205]: DEBUG nova.compute.manager [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Start building block device mappings for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 24 00:14:26 user nova-compute[71205]: INFO nova.virt.block_device [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Booting with blank volume at /dev/vda Apr 24 00:14:26 user nova-compute[71205]: DEBUG nova.policy [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8d0ab07106dd4995aa7e3f5b6bc70e56', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd26ba1ed4b9241f9a084db1a14a945bb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71205) authorize /opt/stack/nova/nova/policy.py:203}} Apr 24 00:14:26 user nova-compute[71205]: WARNING nova.compute.manager [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Volume id: 74021ac9-27f1-4bb6-8e5d-ee21d295af7f finished being created but its status is error. 
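The warning above, and the VolumeNotCreated traceback that follows, come from the block-device wait step: compute polls the new Cinder volume's status and gives up as soon as a terminal "error" is reported, which is why the exception cites "0 seconds or 1 attempts". Below is a minimal sketch of such a poll loop; await_volume_created and get_volume are illustrative names (get_volume is assumed to return a dict with a "status" key), not nova's actual wait function.

```python
# Poll a volume until it becomes 'available'; bail out on the first
# terminal 'error' status, mirroring the single-attempt failure above.
import time


class VolumeNotCreated(Exception):
    pass


def await_volume_created(get_volume, volume_id, interval=1.0, max_attempts=60):
    start = time.monotonic()
    for attempt in range(1, max_attempts + 1):
        status = get_volume(volume_id)["status"]
        if status == "available":
            return attempt
        if status == "error":
            break
        time.sleep(interval)
    raise VolumeNotCreated(
        "Volume %s did not finish being created even after we waited "
        "%d seconds or %d attempts. And its status is %s."
        % (volume_id, int(time.monotonic() - start), attempt, status)
    )


if __name__ == "__main__":
    try:
        await_volume_created(lambda vid: {"status": "error"},
                             "74021ac9-27f1-4bb6-8e5d-ee21d295af7f")
    except VolumeNotCreated as exc:
        print(exc)
```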
Apr 24 00:14:26 user nova-compute[71205]: ERROR nova.compute.manager [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Instance failed block device setup: nova.exception.VolumeNotCreated: Volume 74021ac9-27f1-4bb6-8e5d-ee21d295af7f did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. Apr 24 00:14:26 user nova-compute[71205]: ERROR nova.compute.manager [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Traceback (most recent call last): Apr 24 00:14:26 user nova-compute[71205]: ERROR nova.compute.manager [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] File "/opt/stack/nova/nova/compute/manager.py", line 2175, in _prep_block_device Apr 24 00:14:26 user nova-compute[71205]: ERROR nova.compute.manager [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] driver_block_device.attach_block_devices( Apr 24 00:14:26 user nova-compute[71205]: ERROR nova.compute.manager [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] File "/opt/stack/nova/nova/virt/block_device.py", line 936, in attach_block_devices Apr 24 00:14:26 user nova-compute[71205]: ERROR nova.compute.manager [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] _log_and_attach(device) Apr 24 00:14:26 user nova-compute[71205]: ERROR nova.compute.manager [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] File "/opt/stack/nova/nova/virt/block_device.py", line 933, in _log_and_attach Apr 24 00:14:26 user nova-compute[71205]: ERROR nova.compute.manager [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] bdm.attach(*attach_args, **attach_kwargs) Apr 24 00:14:26 user nova-compute[71205]: ERROR nova.compute.manager [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] File "/opt/stack/nova/nova/virt/block_device.py", line 848, in attach Apr 24 00:14:26 user nova-compute[71205]: ERROR nova.compute.manager [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] self.volume_id, self.attachment_id = self._create_volume( Apr 24 00:14:26 user nova-compute[71205]: ERROR nova.compute.manager [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] File "/opt/stack/nova/nova/virt/block_device.py", line 435, in _create_volume Apr 24 00:14:26 user nova-compute[71205]: ERROR nova.compute.manager [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] self._call_wait_func(context, wait_func, volume_api, vol['id']) Apr 24 00:14:26 user nova-compute[71205]: ERROR nova.compute.manager [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] File "/opt/stack/nova/nova/virt/block_device.py", line 785, in _call_wait_func Apr 24 00:14:26 user nova-compute[71205]: ERROR nova.compute.manager [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] with excutils.save_and_reraise_exception(): Apr 24 00:14:26 user nova-compute[71205]: ERROR nova.compute.manager [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ Apr 24 00:14:26 user nova-compute[71205]: ERROR nova.compute.manager [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] self.force_reraise() Apr 24 00:14:26 user nova-compute[71205]: ERROR nova.compute.manager [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise Apr 24 00:14:26 user nova-compute[71205]: ERROR nova.compute.manager [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] raise self.value Apr 24 00:14:26 user 
nova-compute[71205]: ERROR nova.compute.manager [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] File "/opt/stack/nova/nova/virt/block_device.py", line 783, in _call_wait_func Apr 24 00:14:26 user nova-compute[71205]: ERROR nova.compute.manager [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] wait_func(context, volume_id) Apr 24 00:14:26 user nova-compute[71205]: ERROR nova.compute.manager [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] File "/opt/stack/nova/nova/compute/manager.py", line 1792, in _await_block_device_map_created Apr 24 00:14:26 user nova-compute[71205]: ERROR nova.compute.manager [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] raise exception.VolumeNotCreated(volume_id=vol_id, Apr 24 00:14:26 user nova-compute[71205]: ERROR nova.compute.manager [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] nova.exception.VolumeNotCreated: Volume 74021ac9-27f1-4bb6-8e5d-ee21d295af7f did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. Apr 24 00:14:26 user nova-compute[71205]: ERROR nova.compute.manager [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Apr 24 00:14:27 user nova-compute[71205]: DEBUG nova.network.neutron [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Successfully created port: 8cf3a870-6eaa-4c9c-8f70-d2324e755369 {{(pid=71205) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 24 00:14:27 user nova-compute[71205]: DEBUG nova.network.neutron [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Successfully updated port: 8cf3a870-6eaa-4c9c-8f70-d2324e755369 {{(pid=71205) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 24 00:14:27 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquiring lock "refresh_cache-f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:14:27 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquired lock "refresh_cache-f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:14:27 user nova-compute[71205]: DEBUG nova.network.neutron [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Building network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 24 00:14:27 user nova-compute[71205]: DEBUG nova.compute.manager [req-ffdc6fe3-3d70-4e7e-96f1-7403b2c61fa8 req-8762aacf-eb7c-4496-8cbb-0ecba68153c7 service nova] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Received event network-changed-8cf3a870-6eaa-4c9c-8f70-d2324e755369 {{(pid=71205) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:14:27 user nova-compute[71205]: DEBUG nova.compute.manager [req-ffdc6fe3-3d70-4e7e-96f1-7403b2c61fa8 req-8762aacf-eb7c-4496-8cbb-0ecba68153c7 service nova] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Refreshing instance network info cache due to event network-changed-8cf3a870-6eaa-4c9c-8f70-d2324e755369. {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:14:27 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-ffdc6fe3-3d70-4e7e-96f1-7403b2c61fa8 req-8762aacf-eb7c-4496-8cbb-0ecba68153c7 service nova] Acquiring lock "refresh_cache-f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:14:27 user nova-compute[71205]: DEBUG nova.network.neutron [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Instance cache missing network info. {{(pid=71205) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 24 00:14:28 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:28 user nova-compute[71205]: DEBUG nova.network.neutron [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Updating instance_info_cache with network_info: [{"id": "8cf3a870-6eaa-4c9c-8f70-d2324e755369", "address": "fa:16:3e:f0:c8:68", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cf3a870-6e", "ovs_interfaceid": "8cf3a870-6eaa-4c9c-8f70-d2324e755369", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:14:28 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Releasing lock "refresh_cache-f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:14:28 user nova-compute[71205]: DEBUG nova.compute.manager [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] 
Instance network_info: |[{"id": "8cf3a870-6eaa-4c9c-8f70-d2324e755369", "address": "fa:16:3e:f0:c8:68", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cf3a870-6e", "ovs_interfaceid": "8cf3a870-6eaa-4c9c-8f70-d2324e755369", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 24 00:14:28 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-ffdc6fe3-3d70-4e7e-96f1-7403b2c61fa8 req-8762aacf-eb7c-4496-8cbb-0ecba68153c7 service nova] Acquired lock "refresh_cache-f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:14:28 user nova-compute[71205]: DEBUG nova.network.neutron [req-ffdc6fe3-3d70-4e7e-96f1-7403b2c61fa8 req-8762aacf-eb7c-4496-8cbb-0ecba68153c7 service nova] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Refreshing network info cache for port 8cf3a870-6eaa-4c9c-8f70-d2324e755369 {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:14:28 user nova-compute[71205]: DEBUG nova.compute.claims [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Aborting claim: {{(pid=71205) abort /opt/stack/nova/nova/compute/claims.py:84}} Apr 24 00:14:28 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:28 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:28 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:28 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:28 user nova-compute[71205]: DEBUG 
nova.compute.provider_tree [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:14:28 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:14:28 user nova-compute[71205]: DEBUG nova.network.neutron [req-ffdc6fe3-3d70-4e7e-96f1-7403b2c61fa8 req-8762aacf-eb7c-4496-8cbb-0ecba68153c7 service nova] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Updated VIF entry in instance network info cache for port 8cf3a870-6eaa-4c9c-8f70-d2324e755369. {{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:14:28 user nova-compute[71205]: DEBUG nova.network.neutron [req-ffdc6fe3-3d70-4e7e-96f1-7403b2c61fa8 req-8762aacf-eb7c-4496-8cbb-0ecba68153c7 service nova] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Updating instance_info_cache with network_info: [{"id": "8cf3a870-6eaa-4c9c-8f70-d2324e755369", "address": "fa:16:3e:f0:c8:68", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cf3a870-6e", "ovs_interfaceid": "8cf3a870-6eaa-4c9c-8f70-d2324e755369", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:14:28 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.514s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:28 user nova-compute[71205]: DEBUG nova.compute.manager [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 
tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Build of instance f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e aborted: Volume 74021ac9-27f1-4bb6-8e5d-ee21d295af7f did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. {{(pid=71205) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2636}} Apr 24 00:14:28 user nova-compute[71205]: DEBUG nova.compute.utils [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Build of instance f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e aborted: Volume 74021ac9-27f1-4bb6-8e5d-ee21d295af7f did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. {{(pid=71205) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} Apr 24 00:14:28 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-ffdc6fe3-3d70-4e7e-96f1-7403b2c61fa8 req-8762aacf-eb7c-4496-8cbb-0ecba68153c7 service nova] Releasing lock "refresh_cache-f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:14:28 user nova-compute[71205]: ERROR nova.compute.manager [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Build of instance f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e aborted: Volume 74021ac9-27f1-4bb6-8e5d-ee21d295af7f did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error.: nova.exception.BuildAbortException: Build of instance f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e aborted: Volume 74021ac9-27f1-4bb6-8e5d-ee21d295af7f did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. 
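The traceback and the BuildAbortException above share one root cause: _await_block_device_map_created polled Cinder for boot volume 74021ac9-27f1-4bb6-8e5d-ee21d295af7f, saw it land in the error state on the very first check, and raised VolumeNotCreated, which _prep_block_device then turned into the build abort. The Python below is only a simplified sketch of that wait-and-give-up pattern, not Nova's actual implementation; the get_volume_status callable and the retry/interval defaults are assumptions made for the example.

import time

class VolumeNotCreated(Exception):
    """Simplified stand-in for nova.exception.VolumeNotCreated."""

def await_volume_created(get_volume_status, volume_id, max_attempts=60, interval=1.0):
    """Poll a volume's status and give up early if it goes to 'error'.

    get_volume_status is a hypothetical callable standing in for a Cinder
    client call; it should return strings such as 'creating', 'available'
    or 'error'.
    """
    start = time.time()
    for attempt in range(1, max_attempts + 1):
        status = get_volume_status(volume_id)
        if status == 'available':
            return attempt
        if status == 'error':
            # An 'error' status aborts immediately, which is why the log can
            # report "0 seconds or 1 attempts".
            raise VolumeNotCreated(
                "Volume %s did not finish being created even after we waited "
                "%d seconds or %d attempts. And its status is %s."
                % (volume_id, int(time.time() - start), attempt, status))
        time.sleep(interval)
    raise VolumeNotCreated(
        "Volume %s did not finish being created even after we waited "
        "%d seconds or %d attempts."
        % (volume_id, int(time.time() - start), max_attempts))

if __name__ == "__main__":
    # Simulate Cinder reporting the volume in 'error' on the very first poll,
    # as happened in the trace above.
    try:
        await_volume_created(lambda vid: 'error',
                             '74021ac9-27f1-4bb6-8e5d-ee21d295af7f')
    except VolumeNotCreated as exc:
        print(exc)

In the real service this loop is governed by the block_device_allocate_retries and block_device_allocate_retries_interval configuration options rather than the hard-coded defaults used in the sketch.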
Apr 24 00:14:28 user nova-compute[71205]: DEBUG nova.compute.manager [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Unplugging VIFs for instance {{(pid=71205) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} Apr 24 00:14:28 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:14:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-238075653',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-238075653',id=15,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d26ba1ed4b9241f9a084db1a14a945bb',ramdisk_id='',reservation_id='r-ml8kpv2m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-2021792443',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member'},tags=TagList,task_state='block_device_mapping',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:14:26Z,user_data=None,user_id='8d0ab07106dd4995aa7e3f5b6bc70e56',uuid=f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8cf3a870-6eaa-4c9c-8f70-d2324e755369", "address": "fa:16:3e:f0:c8:68", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": 
"tap8cf3a870-6e", "ovs_interfaceid": "8cf3a870-6eaa-4c9c-8f70-d2324e755369", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 24 00:14:28 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Converting VIF {"id": "8cf3a870-6eaa-4c9c-8f70-d2324e755369", "address": "fa:16:3e:f0:c8:68", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cf3a870-6e", "ovs_interfaceid": "8cf3a870-6eaa-4c9c-8f70-d2324e755369", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:14:28 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f0:c8:68,bridge_name='br-int',has_traffic_filtering=True,id=8cf3a870-6eaa-4c9c-8f70-d2324e755369,network=Network(52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cf3a870-6e') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:14:28 user nova-compute[71205]: DEBUG os_vif [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f0:c8:68,bridge_name='br-int',has_traffic_filtering=True,id=8cf3a870-6eaa-4c9c-8f70-d2324e755369,network=Network(52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cf3a870-6e') {{(pid=71205) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 24 00:14:28 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:28 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8cf3a870-6e, bridge=br-int, if_exists=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:14:28 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no 
change {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 24 00:14:28 user nova-compute[71205]: INFO os_vif [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f0:c8:68,bridge_name='br-int',has_traffic_filtering=True,id=8cf3a870-6eaa-4c9c-8f70-d2324e755369,network=Network(52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cf3a870-6e') Apr 24 00:14:28 user nova-compute[71205]: DEBUG nova.compute.manager [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Unplugged VIFs for instance {{(pid=71205) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} Apr 24 00:14:28 user nova-compute[71205]: DEBUG nova.compute.manager [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Deallocating network for instance {{(pid=71205) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 24 00:14:28 user nova-compute[71205]: DEBUG nova.network.neutron [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] deallocate_for_instance() {{(pid=71205) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 24 00:14:29 user nova-compute[71205]: DEBUG nova.network.neutron [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:14:29 user nova-compute[71205]: INFO nova.compute.manager [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e] Took 0.69 seconds to deallocate network for instance. 
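The cleanup entries above follow the failure path in order: the resource claim is aborted, nova.network.os_vif_util converts the cached VIF dict into a VIFOpenVSwitch object, os-vif issues a DelPortCommand for tap8cf3a870-6e on br-int (a no-op here, since the port was never actually plugged), and the network allocation is torn down. As a rough illustration of the conversion step only, the sketch below maps a trimmed copy of the logged VIF dict onto the fields visible in the "Converted object VIFOpenVSwitch(...)" entry; SimpleVIF is an illustrative stand-in, not the real os_vif object model.

import json
from dataclasses import dataclass

# Trimmed copy of the VIF dict logged by nova.network.os_vif_util above.
VIF_JSON = """
{
  "id": "8cf3a870-6eaa-4c9c-8f70-d2324e755369",
  "address": "fa:16:3e:f0:c8:68",
  "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int"},
  "type": "ovs",
  "details": {"port_filter": true, "connectivity": "l2"},
  "devname": "tap8cf3a870-6e",
  "active": false,
  "preserve_on_delete": false
}
"""

@dataclass
class SimpleVIF:
    """Illustrative stand-in for os_vif's VIFOpenVSwitch object."""
    id: str
    address: str
    bridge_name: str
    vif_name: str
    has_traffic_filtering: bool
    active: bool
    preserve_on_delete: bool

def nova_to_simple_vif(vif: dict) -> SimpleVIF:
    """Map the Nova-side VIF dict onto the fields seen in the converted object."""
    return SimpleVIF(
        id=vif["id"],
        address=vif["address"],
        bridge_name=vif["network"]["bridge"],
        vif_name=vif["devname"],
        has_traffic_filtering=vif["details"]["port_filter"],
        active=vif["active"],
        preserve_on_delete=vif["preserve_on_delete"],
    )

if __name__ == "__main__":
    print(nova_to_simple_vif(json.loads(VIF_JSON)))

The real conversion lives in nova_to_osvif_vif (nova/network/os_vif_util.py), as the source references in the log entries above show.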
Apr 24 00:14:29 user nova-compute[71205]: INFO nova.scheduler.client.report [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Deleted allocations for instance f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e Apr 24 00:14:29 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-d365e8a8-c346-49f7-a8b4-5230eceea4f0 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "f2eb4cbc-772c-4e6b-b0a1-4ab8378e841e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 4.175s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:31 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:33 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:33 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:39 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:40 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:42 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:43 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:43 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquiring lock "43f004f3-9b3f-4388-88a8-8eb663ba36a3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:43 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "43f004f3-9b3f-4388-88a8-8eb663ba36a3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" 
:: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:43 user nova-compute[71205]: DEBUG nova.compute.manager [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Starting instance... {{(pid=71205) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 24 00:14:43 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:43 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:43 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:43 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71205) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 24 00:14:43 user nova-compute[71205]: INFO nova.compute.claims [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Claim successful on node user Apr 24 00:14:44 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:14:44 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:14:44 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.594s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:44 user nova-compute[71205]: DEBUG nova.compute.manager [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Start building networks asynchronously for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 24 00:14:44 user nova-compute[71205]: DEBUG nova.compute.manager [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Allocating IP information in the background. {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 24 00:14:44 user nova-compute[71205]: DEBUG nova.network.neutron [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] allocate_for_instance() {{(pid=71205) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 24 00:14:44 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 24 00:14:44 user nova-compute[71205]: DEBUG nova.compute.manager [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Start building block device mappings for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 24 00:14:44 user nova-compute[71205]: DEBUG nova.policy [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '35edcadbe77c4f4fa8304216e7f61d4a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '97d1e8a757a746329ea363af81a3c6b4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71205) authorize /opt/stack/nova/nova/policy.py:203}} Apr 24 00:14:44 user nova-compute[71205]: DEBUG nova.compute.manager [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Start spawning the instance on the hypervisor. {{(pid=71205) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 24 00:14:44 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Creating instance directory {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 24 00:14:44 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Creating image(s) Apr 24 00:14:44 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquiring lock "/opt/stack/data/nova/instances/43f004f3-9b3f-4388-88a8-8eb663ba36a3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:44 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "/opt/stack/data/nova/instances/43f004f3-9b3f-4388-88a8-8eb663ba36a3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:44 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock 
"/opt/stack/data/nova/instances/43f004f3-9b3f-4388-88a8-8eb663ba36a3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:44 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:44 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.148s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:44 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquiring lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:44 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:44 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:44 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.137s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:44 user nova-compute[71205]: DEBUG 
oslo_concurrency.processutils [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/43f004f3-9b3f-4388-88a8-8eb663ba36a3/disk 1073741824 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:44 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/43f004f3-9b3f-4388-88a8-8eb663ba36a3/disk 1073741824" returned: 0 in 0.051s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:44 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.195s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:44 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:44 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.144s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:44 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Checking if we can resize image /opt/stack/data/nova/instances/43f004f3-9b3f-4388-88a8-8eb663ba36a3/disk. 
size=1073741824 {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 24 00:14:44 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/43f004f3-9b3f-4388-88a8-8eb663ba36a3/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:45 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/43f004f3-9b3f-4388-88a8-8eb663ba36a3/disk --force-share --output=json" returned: 0 in 0.153s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:45 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Cannot resize image /opt/stack/data/nova/instances/43f004f3-9b3f-4388-88a8-8eb663ba36a3/disk to a smaller size. {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 24 00:14:45 user nova-compute[71205]: DEBUG nova.objects.instance [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lazy-loading 'migration_context' on Instance uuid 43f004f3-9b3f-4388-88a8-8eb663ba36a3 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:14:45 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Created local disks {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 24 00:14:45 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Ensure instance console log exists: /opt/stack/data/nova/instances/43f004f3-9b3f-4388-88a8-8eb663ba36a3/console.log {{(pid=71205) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 24 00:14:45 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:45 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:45 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:45 user nova-compute[71205]: DEBUG nova.network.neutron [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Successfully created port: 389e432f-0336-45fe-b2ff-4ba1ac63e0f3 {{(pid=71205) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 24 00:14:45 user nova-compute[71205]: DEBUG nova.network.neutron [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Successfully updated port: 389e432f-0336-45fe-b2ff-4ba1ac63e0f3 {{(pid=71205) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 24 00:14:45 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquiring lock "refresh_cache-43f004f3-9b3f-4388-88a8-8eb663ba36a3" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:14:45 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquired lock "refresh_cache-43f004f3-9b3f-4388-88a8-8eb663ba36a3" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:14:45 user nova-compute[71205]: DEBUG nova.network.neutron [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Building network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 24 00:14:45 user nova-compute[71205]: DEBUG nova.compute.manager [req-66054cd4-f972-493c-b321-c07726f254e1 req-a423375c-56cc-45d8-a767-ed77a5f6791c service nova] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Received event network-changed-389e432f-0336-45fe-b2ff-4ba1ac63e0f3 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:14:45 user nova-compute[71205]: DEBUG nova.compute.manager [req-66054cd4-f972-493c-b321-c07726f254e1 req-a423375c-56cc-45d8-a767-ed77a5f6791c service nova] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Refreshing instance network info cache due to event network-changed-389e432f-0336-45fe-b2ff-4ba1ac63e0f3. 
{{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:14:45 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-66054cd4-f972-493c-b321-c07726f254e1 req-a423375c-56cc-45d8-a767-ed77a5f6791c service nova] Acquiring lock "refresh_cache-43f004f3-9b3f-4388-88a8-8eb663ba36a3" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:14:45 user nova-compute[71205]: DEBUG nova.network.neutron [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Instance cache missing network info. {{(pid=71205) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG nova.network.neutron [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Updating instance_info_cache with network_info: [{"id": "389e432f-0336-45fe-b2ff-4ba1ac63e0f3", "address": "fa:16:3e:52:21:72", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap389e432f-03", "ovs_interfaceid": "389e432f-0336-45fe-b2ff-4ba1ac63e0f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Releasing lock "refresh_cache-43f004f3-9b3f-4388-88a8-8eb663ba36a3" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG nova.compute.manager [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Instance network_info: |[{"id": "389e432f-0336-45fe-b2ff-4ba1ac63e0f3", "address": "fa:16:3e:52:21:72", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, 
"type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap389e432f-03", "ovs_interfaceid": "389e432f-0336-45fe-b2ff-4ba1ac63e0f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-66054cd4-f972-493c-b321-c07726f254e1 req-a423375c-56cc-45d8-a767-ed77a5f6791c service nova] Acquired lock "refresh_cache-43f004f3-9b3f-4388-88a8-8eb663ba36a3" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG nova.network.neutron [req-66054cd4-f972-493c-b321-c07726f254e1 req-a423375c-56cc-45d8-a767-ed77a5f6791c service nova] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Refreshing network info cache for port 389e432f-0336-45fe-b2ff-4ba1ac63e0f3 {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Start _get_guest_xml network_info=[{"id": "389e432f-0336-45fe-b2ff-4ba1ac63e0f3", "address": "fa:16:3e:52:21:72", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap389e432f-03", "ovs_interfaceid": "389e432f-0336-45fe-b2ff-4ba1ac63e0f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': 'fcf09ead-c5af-40cc-b5cf-92626e181ef9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71205) _get_guest_xml 
/opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 24 00:14:46 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:14:46 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:14:46 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71205) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-24T00:08:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=), allow threads: True {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Flavor limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Image limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Flavor pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Image pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG nova.virt.hardware [None 
req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Got 1 possible topologies {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:14:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-482287756',display_name='tempest-AttachVolumeNegativeTest-server-482287756',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-482287756',id=16,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFZwHgbIk6e428TAmz7l4fwQyTPgmH1ghkhtoPbQ0yUpoo/9gV6SJaSCP7G0NDgEplyQD28R6/mr35tsYyIgUkJf2JTlCo+FavLRnp0EzEd9fZ37x1cfd9aCy0pyr8ntuw==',key_name='tempest-keypair-1687889936',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='97d1e8a757a746329ea363af81a3c6b4',ramdisk_id='',reservation_id='r-jlbi39yi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-272859998',owner_user_name='tempest-AttachVolumeNegativeTest-272859998-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:14:44Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35edcadbe77c4f4fa8304216e7f61d4a',uuid=43f004f3-9b3f-4388-88a8-8eb663ba36a3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "389e432f-0336-45fe-b2ff-4ba1ac63e0f3", "address": "fa:16:3e:52:21:72", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap389e432f-03", "ovs_interfaceid": "389e432f-0336-45fe-b2ff-4ba1ac63e0f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71205) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Converting VIF {"id": "389e432f-0336-45fe-b2ff-4ba1ac63e0f3", "address": "fa:16:3e:52:21:72", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap389e432f-03", "ovs_interfaceid": "389e432f-0336-45fe-b2ff-4ba1ac63e0f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:21:72,bridge_name='br-int',has_traffic_filtering=True,id=389e432f-0336-45fe-b2ff-4ba1ac63e0f3,network=Network(1a83eb44-caaf-46ff-8823-be4580052b3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap389e432f-03') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG nova.objects.instance [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lazy-loading 'pci_devices' on Instance uuid 43f004f3-9b3f-4388-88a8-8eb663ba36a3 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] End _get_guest_xml xml= Apr 24 00:14:46 user nova-compute[71205]: 43f004f3-9b3f-4388-88a8-8eb663ba36a3 Apr 24 00:14:46 user nova-compute[71205]: instance-00000010 Apr 24 00:14:46 user nova-compute[71205]: 131072 Apr 24 00:14:46 user nova-compute[71205]: 1 Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: tempest-AttachVolumeNegativeTest-server-482287756 Apr 24 00:14:46 user nova-compute[71205]: 2023-04-24 00:14:46 Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: 128 Apr 24 00:14:46 user nova-compute[71205]: 1 Apr 24 00:14:46 user nova-compute[71205]: 0 Apr 24 00:14:46 user nova-compute[71205]: 0 Apr 24 00:14:46 user nova-compute[71205]: 1 Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: tempest-AttachVolumeNegativeTest-272859998-project-member Apr 24 00:14:46 user nova-compute[71205]: tempest-AttachVolumeNegativeTest-272859998 Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: OpenStack Foundation Apr 24 00:14:46 user nova-compute[71205]: OpenStack Nova Apr 24 00:14:46 user nova-compute[71205]: 0.0.0 Apr 24 00:14:46 user 
nova-compute[71205]: 43f004f3-9b3f-4388-88a8-8eb663ba36a3 Apr 24 00:14:46 user nova-compute[71205]: 43f004f3-9b3f-4388-88a8-8eb663ba36a3 Apr 24 00:14:46 user nova-compute[71205]: Virtual Machine Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: hvm Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Nehalem Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: /dev/urandom Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: Apr 24 00:14:46 user nova-compute[71205]: {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:14:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-482287756',display_name='tempest-AttachVolumeNegativeTest-server-482287756',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-482287756',id=16,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFZwHgbIk6e428TAmz7l4fwQyTPgmH1ghkhtoPbQ0yUpoo/9gV6SJaSCP7G0NDgEplyQD28R6/mr35tsYyIgUkJf2JTlCo+FavLRnp0EzEd9fZ37x1cfd9aCy0pyr8ntuw==',key_name='tempest-keypair-1687889936',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='97d1e8a757a746329ea363af81a3c6b4',ramdisk_id='',reservation_id='r-jlbi39yi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-272859998',owner_user_name='tempest-AttachVolumeNegativeTest-272859998-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:14:44Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35edcadbe77c4f4fa8304216e7f61d4a',uuid=43f004f3-9b3f-4388-88a8-8eb663ba36a3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "389e432f-0336-45fe-b2ff-4ba1ac63e0f3", "address": "fa:16:3e:52:21:72", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap389e432f-03", "ovs_interfaceid": "389e432f-0336-45fe-b2ff-4ba1ac63e0f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Converting VIF {"id": "389e432f-0336-45fe-b2ff-4ba1ac63e0f3", "address": "fa:16:3e:52:21:72", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": 
false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap389e432f-03", "ovs_interfaceid": "389e432f-0336-45fe-b2ff-4ba1ac63e0f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:21:72,bridge_name='br-int',has_traffic_filtering=True,id=389e432f-0336-45fe-b2ff-4ba1ac63e0f3,network=Network(1a83eb44-caaf-46ff-8823-be4580052b3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap389e432f-03') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG os_vif [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:21:72,bridge_name='br-int',has_traffic_filtering=True,id=389e432f-0336-45fe-b2ff-4ba1ac63e0f3,network=Network(1a83eb44-caaf-46ff-8823-be4580052b3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap389e432f-03') {{(pid=71205) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap389e432f-03, may_exist=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap389e432f-03, col_values=(('external_ids', {'iface-id': '389e432f-0336-45fe-b2ff-4ba1ac63e0f3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:52:21:72', 'vm-uuid': '43f004f3-9b3f-4388-88a8-8eb663ba36a3'}),)) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:46 user nova-compute[71205]: INFO os_vif [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:21:72,bridge_name='br-int',has_traffic_filtering=True,id=389e432f-0336-45fe-b2ff-4ba1ac63e0f3,network=Network(1a83eb44-caaf-46ff-8823-be4580052b3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap389e432f-03') Apr 24 00:14:46 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] No BDM found with device name vda, not building metadata. {{(pid=71205) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] No VIF found with MAC fa:16:3e:52:21:72, not building metadata {{(pid=71205) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG nova.network.neutron [req-66054cd4-f972-493c-b321-c07726f254e1 req-a423375c-56cc-45d8-a767-ed77a5f6791c service nova] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Updated VIF entry in instance network info cache for port 389e432f-0336-45fe-b2ff-4ba1ac63e0f3. 
{{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG nova.network.neutron [req-66054cd4-f972-493c-b321-c07726f254e1 req-a423375c-56cc-45d8-a767-ed77a5f6791c service nova] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Updating instance_info_cache with network_info: [{"id": "389e432f-0336-45fe-b2ff-4ba1ac63e0f3", "address": "fa:16:3e:52:21:72", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap389e432f-03", "ovs_interfaceid": "389e432f-0336-45fe-b2ff-4ba1ac63e0f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:14:46 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-66054cd4-f972-493c-b321-c07726f254e1 req-a423375c-56cc-45d8-a767-ed77a5f6791c service nova] Releasing lock "refresh_cache-43f004f3-9b3f-4388-88a8-8eb663ba36a3" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:14:47 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:47 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:47 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:47 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:47 user nova-compute[71205]: DEBUG nova.compute.manager [req-19bd8b96-f263-4d2e-80f4-fcbbb9e7c599 req-0586d82d-8fee-4056-b17e-601eed956215 service nova] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Received event network-vif-plugged-389e432f-0336-45fe-b2ff-4ba1ac63e0f3 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:14:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-19bd8b96-f263-4d2e-80f4-fcbbb9e7c599 req-0586d82d-8fee-4056-b17e-601eed956215 service nova] Acquiring lock "43f004f3-9b3f-4388-88a8-8eb663ba36a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-19bd8b96-f263-4d2e-80f4-fcbbb9e7c599 req-0586d82d-8fee-4056-b17e-601eed956215 service nova] Lock "43f004f3-9b3f-4388-88a8-8eb663ba36a3-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-19bd8b96-f263-4d2e-80f4-fcbbb9e7c599 req-0586d82d-8fee-4056-b17e-601eed956215 service nova] Lock "43f004f3-9b3f-4388-88a8-8eb663ba36a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:47 user nova-compute[71205]: DEBUG nova.compute.manager [req-19bd8b96-f263-4d2e-80f4-fcbbb9e7c599 req-0586d82d-8fee-4056-b17e-601eed956215 service nova] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] No waiting events found dispatching network-vif-plugged-389e432f-0336-45fe-b2ff-4ba1ac63e0f3 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:14:47 user nova-compute[71205]: WARNING nova.compute.manager [req-19bd8b96-f263-4d2e-80f4-fcbbb9e7c599 req-0586d82d-8fee-4056-b17e-601eed956215 service nova] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Received unexpected event network-vif-plugged-389e432f-0336-45fe-b2ff-4ba1ac63e0f3 for instance with vm_state building and task_state spawning. Apr 24 00:14:48 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:48 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:48 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:49 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:49 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Resumed> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:14:49 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] VM Resumed (Lifecycle Event) Apr 24 00:14:49 user nova-compute[71205]: DEBUG nova.compute.manager [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Instance event wait completed in 0 seconds for {{(pid=71205) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 24 00:14:49 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Guest created on hypervisor {{(pid=71205) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 24 00:14:49 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Instance spawned successfully. 
Apr 24 00:14:49 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 24 00:14:49 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:14:49 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:14:49 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Found default for hw_cdrom_bus of ide {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:14:49 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Found default for hw_disk_bus of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:14:49 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Found default for hw_input_bus of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:14:49 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Found default for hw_pointer_model of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:14:49 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Found default for hw_video_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:14:49 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 
43f004f3-9b3f-4388-88a8-8eb663ba36a3] Found default for hw_vif_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:14:49 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:14:49 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Started> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:14:49 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] VM Started (Lifecycle Event) Apr 24 00:14:49 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:14:49 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:14:49 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:14:49 user nova-compute[71205]: INFO nova.compute.manager [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Took 5.49 seconds to spawn the instance on the hypervisor. Apr 24 00:14:49 user nova-compute[71205]: DEBUG nova.compute.manager [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:14:49 user nova-compute[71205]: INFO nova.compute.manager [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Took 6.40 seconds to build instance. 
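The two timing lines above (5.49 s to spawn on the hypervisor, 6.40 s to build) are the quickest health signal to pull out of a capture like this. Below is a small, illustrative stdlib-only parser; the filename is hypothetical and it assumes each "Took ... seconds" message survives on a single line of the capture.

    # Hypothetical helper (parse_spawn_times.py): extract per-instance
    # spawn/build durations from a journald capture like the one above.
    import re
    import sys

    PATTERN = re.compile(
        r"\[instance: (?P<uuid>[0-9a-f-]{36})\] "
        r"Took (?P<secs>[\d.]+) seconds to "
        r"(?P<what>spawn the instance on the hypervisor|build instance)"
    )

    def main(path):
        with open(path, encoding="utf-8", errors="replace") as fh:
            for line in fh:
                m = PATTERN.search(line)
                if m:
                    print(f"{m['uuid']}  {m['what']:<40} {float(m['secs']):6.2f}s")

    if __name__ == "__main__":
        main(sys.argv[1])

Run against this journal it should report both durations for instance 43f004f3-9b3f-4388-88a8-8eb663ba36a3.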
Apr 24 00:14:49 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-22dfeae1-076b-416d-960a-70f97ac41ae2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "43f004f3-9b3f-4388-88a8-8eb663ba36a3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.501s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:50 user nova-compute[71205]: DEBUG nova.compute.manager [req-31a691b6-5325-4fc9-b7b6-229712e820ff req-6d790fda-9512-4b49-9c80-cf90cb02f335 service nova] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Received event network-vif-plugged-389e432f-0336-45fe-b2ff-4ba1ac63e0f3 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:14:50 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-31a691b6-5325-4fc9-b7b6-229712e820ff req-6d790fda-9512-4b49-9c80-cf90cb02f335 service nova] Acquiring lock "43f004f3-9b3f-4388-88a8-8eb663ba36a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:50 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-31a691b6-5325-4fc9-b7b6-229712e820ff req-6d790fda-9512-4b49-9c80-cf90cb02f335 service nova] Lock "43f004f3-9b3f-4388-88a8-8eb663ba36a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:50 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-31a691b6-5325-4fc9-b7b6-229712e820ff req-6d790fda-9512-4b49-9c80-cf90cb02f335 service nova] Lock "43f004f3-9b3f-4388-88a8-8eb663ba36a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:50 user nova-compute[71205]: DEBUG nova.compute.manager [req-31a691b6-5325-4fc9-b7b6-229712e820ff req-6d790fda-9512-4b49-9c80-cf90cb02f335 service nova] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] No waiting events found dispatching network-vif-plugged-389e432f-0336-45fe-b2ff-4ba1ac63e0f3 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:14:50 user nova-compute[71205]: WARNING nova.compute.manager [req-31a691b6-5325-4fc9-b7b6-229712e820ff req-6d790fda-9512-4b49-9c80-cf90cb02f335 service nova] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Received unexpected event network-vif-plugged-389e432f-0336-45fe-b2ff-4ba1ac63e0f3 for instance with vm_state active and task_state None. 
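The Acquiring / acquired / "released" lines around the "43f004f3-...-events" lock are oslo.concurrency's built-in lock tracing: anything wrapped in lockutils emits the same "waited" / "held" timings at DEBUG level. A minimal illustration of the pattern follows; the lock name and function are invented for the example and are not nova's.

    # Minimal illustration of the oslo.concurrency locking pattern whose
    # DEBUG output appears above; 'demo-events' and handle_event are made up.
    from oslo_concurrency import lockutils

    @lockutils.synchronized('demo-events')
    def handle_event(event):
        # Runs with an in-process lock held; with debug logging enabled,
        # lockutils logs the 'acquired by ... :: waited' and
        # '"released" ... :: held' lines around this body.
        print('handling', event)

    handle_event('network-vif-plugged')

The "Received unexpected event ... vm_state active and task_state None" warning that follows is benign here: the instance had already finished building, so no waiter was registered for the late network-vif-plugged notification.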
Apr 24 00:14:51 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Acquiring lock "cf2c88d5-8347-4166-a037-158f29c32d1a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Lock "cf2c88d5-8347-4166-a037-158f29c32d1a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:56 user nova-compute[71205]: DEBUG nova.compute.manager [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Starting instance... {{(pid=71205) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 24 00:14:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:56 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71205) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 24 00:14:56 user nova-compute[71205]: INFO nova.compute.claims [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Claim successful on node user Apr 24 00:14:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:56 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:14:56 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:14:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.382s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:56 user nova-compute[71205]: DEBUG nova.compute.manager [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Start building networks asynchronously for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 24 00:14:56 user nova-compute[71205]: DEBUG nova.compute.manager [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Allocating IP information in the background. 
{{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 24 00:14:56 user nova-compute[71205]: DEBUG nova.network.neutron [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] allocate_for_instance() {{(pid=71205) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 24 00:14:56 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 24 00:14:56 user nova-compute[71205]: DEBUG nova.compute.manager [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Start building block device mappings for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 24 00:14:56 user nova-compute[71205]: DEBUG nova.policy [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8536b0b100af4c81b0114b37a10ec017', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '211ed190ee5c4a0b98b85960339ea437', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71205) authorize /opt/stack/nova/nova/policy.py:203}} Apr 24 00:14:56 user nova-compute[71205]: DEBUG nova.compute.manager [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Start spawning the instance on the hypervisor. 
{{(pid=71205) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 24 00:14:56 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Creating instance directory {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 24 00:14:56 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Creating image(s) Apr 24 00:14:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Acquiring lock "/opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Lock "/opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Lock "/opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:56 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:56 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.137s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 
tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Acquiring lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:56 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:56 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.132s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:56 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk 1073741824 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:57 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk 1073741824" returned: 0 in 0.045s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:57 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.183s {{(pid=71205) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:57 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:57 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.133s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:57 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Checking if we can resize image /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk. size=1073741824 {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 24 00:14:57 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:14:57 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:14:57 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Cannot resize image /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk to a smaller size. 
{{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 24 00:14:57 user nova-compute[71205]: DEBUG nova.objects.instance [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Lazy-loading 'migration_context' on Instance uuid cf2c88d5-8347-4166-a037-158f29c32d1a {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:14:57 user nova-compute[71205]: DEBUG nova.network.neutron [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Successfully created port: 783a3713-64fb-48f3-b3ba-0312249006eb {{(pid=71205) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 24 00:14:57 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Created local disks {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 24 00:14:57 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Ensure instance console log exists: /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/console.log {{(pid=71205) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 24 00:14:57 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:14:57 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:14:57 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:14:57 user nova-compute[71205]: DEBUG nova.network.neutron [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Successfully updated port: 783a3713-64fb-48f3-b3ba-0312249006eb {{(pid=71205) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 24 00:14:57 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 
tempest-ServerActionsTestJSON-1785200975-project-member] Acquiring lock "refresh_cache-cf2c88d5-8347-4166-a037-158f29c32d1a" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:14:57 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Acquired lock "refresh_cache-cf2c88d5-8347-4166-a037-158f29c32d1a" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:14:57 user nova-compute[71205]: DEBUG nova.network.neutron [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Building network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.compute.manager [req-082148f4-5c38-4a4a-a8ab-984b2f003a63 req-437d4fa3-1139-48d8-a102-696c757d7af9 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Received event network-changed-783a3713-64fb-48f3-b3ba-0312249006eb {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.compute.manager [req-082148f4-5c38-4a4a-a8ab-984b2f003a63 req-437d4fa3-1139-48d8-a102-696c757d7af9 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Refreshing instance network info cache due to event network-changed-783a3713-64fb-48f3-b3ba-0312249006eb. {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-082148f4-5c38-4a4a-a8ab-984b2f003a63 req-437d4fa3-1139-48d8-a102-696c757d7af9 service nova] Acquiring lock "refresh_cache-cf2c88d5-8347-4166-a037-158f29c32d1a" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.network.neutron [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Instance cache missing network info. 
{{(pid=71205) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.network.neutron [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Updating instance_info_cache with network_info: [{"id": "783a3713-64fb-48f3-b3ba-0312249006eb", "address": "fa:16:3e:87:78:10", "network": {"id": "551bb188-8d6d-484a-813f-5f76f0ec0e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1308396726-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "211ed190ee5c4a0b98b85960339ea437", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap783a3713-64", "ovs_interfaceid": "783a3713-64fb-48f3-b3ba-0312249006eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Releasing lock "refresh_cache-cf2c88d5-8347-4166-a037-158f29c32d1a" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.compute.manager [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Instance network_info: |[{"id": "783a3713-64fb-48f3-b3ba-0312249006eb", "address": "fa:16:3e:87:78:10", "network": {"id": "551bb188-8d6d-484a-813f-5f76f0ec0e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1308396726-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "211ed190ee5c4a0b98b85960339ea437", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap783a3713-64", "ovs_interfaceid": "783a3713-64fb-48f3-b3ba-0312249006eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-082148f4-5c38-4a4a-a8ab-984b2f003a63 req-437d4fa3-1139-48d8-a102-696c757d7af9 service nova] Acquired lock "refresh_cache-cf2c88d5-8347-4166-a037-158f29c32d1a" {{(pid=71205) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.network.neutron [req-082148f4-5c38-4a4a-a8ab-984b2f003a63 req-437d4fa3-1139-48d8-a102-696c757d7af9 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Refreshing network info cache for port 783a3713-64fb-48f3-b3ba-0312249006eb {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Start _get_guest_xml network_info=[{"id": "783a3713-64fb-48f3-b3ba-0312249006eb", "address": "fa:16:3e:87:78:10", "network": {"id": "551bb188-8d6d-484a-813f-5f76f0ec0e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1308396726-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "211ed190ee5c4a0b98b85960339ea437", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap783a3713-64", "ovs_interfaceid": "783a3713-64fb-48f3-b3ba-0312249006eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': 'fcf09ead-c5af-40cc-b5cf-92626e181ef9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 24 00:14:58 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:14:58 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
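
The qemu-img sequence logged above (an external lock named after the cached base image 26d4c718c7a2a978d2022c858a570bbc0ccab5d8, a `qemu-img info` probe run under prlimit, then `qemu-img create -f qcow2` with a backing_file pointing at the base) is the standard copy-on-write overlay pattern for booting from an image cache. The sketch below reproduces that pattern with oslo.concurrency under stated assumptions: the function names, lock_path and resource limits are illustrative, not nova's actual imagebackend code.

    # Minimal sketch, assuming oslo.concurrency is installed; it mirrors the
    # commands in the log above but is not nova's implementation.
    import json

    from oslo_concurrency import lockutils
    from oslo_concurrency import processutils

    QEMU_IMG_LIMITS = processutils.ProcessLimits(
        cpu_time=30,                # matches the logged "--cpu=30"
        address_space=1024 ** 3)    # matches the logged "--as=1073741824"


    def qemu_img_info(path):
        # qemu-img info <path> --force-share --output=json, wrapped by
        # oslo_concurrency.prlimit exactly as shown in the log.
        out, _err = processutils.execute(
            'env', 'LC_ALL=C', 'LANG=C', 'qemu-img', 'info', path,
            '--force-share', '--output=json', prlimit=QEMU_IMG_LIMITS)
        return json.loads(out)


    def create_qcow2_overlay(base_path, overlay_path, size_bytes):
        # Serialize on the base image file name so concurrent boots from the
        # same image do not race while the backing file is being inspected.
        with lockutils.lock(base_path.rsplit('/', 1)[-1], external=True,
                            lock_path='/opt/stack/data/nova'):
            base_info = qemu_img_info(base_path)
            processutils.execute(
                'env', 'LC_ALL=C', 'LANG=C', 'qemu-img', 'create',
                '-f', 'qcow2',
                '-o', 'backing_file=%s,backing_fmt=%s' % (
                    base_path, base_info.get('format', 'raw')),
                overlay_path, str(size_bytes))

Because the overlay only references the base image, repeated boots from the same image pay only for the cheap `qemu-img create` step, which is why the create_qcow2_image critical section in the log is held for just 0.183s.
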
Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71205) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-24T00:08:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=), allow threads: True {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Flavor limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Image limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Flavor pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Image pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 
24 00:14:58 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Got 1 possible topologies {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:14:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-75602253',display_name='tempest-ServerActionsTestJSON-server-75602253',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serveractionstestjson-server-75602253',id=17,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHBH505bABPeY47tqqwKUNL0j9rZei8h+xMYLIM0SjTR/WY833MZWOzZDU3xf+eKarpT1DDNfoU7YX1pgGnkISVtzLfDBcBBJeM+PXJDnAxAnfo3AHLlU/Wx1vdEG2Tqyg==',key_name='tempest-keypair-1442678964',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='211ed190ee5c4a0b98b85960339ea437',ramdisk_id='',reservation_id='r-q1azt7t2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1785200975',owner_user_name='tempest-ServerActionsTestJSON-1785200975-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:14:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='8536b0b100af4c81b0114b37a10ec017',uuid=cf2c88d5-8347-4166-a037-158f29c32d1a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "783a3713-64fb-48f3-b3ba-0312249006eb", "address": "fa:16:3e:87:78:10", "network": {"id": "551bb188-8d6d-484a-813f-5f76f0ec0e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1308396726-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "211ed190ee5c4a0b98b85960339ea437", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap783a3713-64", "ovs_interfaceid": "783a3713-64fb-48f3-b3ba-0312249006eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71205) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Converting VIF {"id": "783a3713-64fb-48f3-b3ba-0312249006eb", "address": "fa:16:3e:87:78:10", "network": {"id": "551bb188-8d6d-484a-813f-5f76f0ec0e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1308396726-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "211ed190ee5c4a0b98b85960339ea437", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap783a3713-64", "ovs_interfaceid": "783a3713-64fb-48f3-b3ba-0312249006eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:78:10,bridge_name='br-int',has_traffic_filtering=True,id=783a3713-64fb-48f3-b3ba-0312249006eb,network=Network(551bb188-8d6d-484a-813f-5f76f0ec0e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap783a3713-64') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.objects.instance [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Lazy-loading 'pci_devices' on Instance uuid cf2c88d5-8347-4166-a037-158f29c32d1a {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] End _get_guest_xml xml= Apr 24 00:14:58 user nova-compute[71205]: cf2c88d5-8347-4166-a037-158f29c32d1a Apr 24 00:14:58 user nova-compute[71205]: instance-00000011 Apr 24 00:14:58 user nova-compute[71205]: 131072 Apr 24 00:14:58 user nova-compute[71205]: 1 Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: tempest-ServerActionsTestJSON-server-75602253 Apr 24 00:14:58 user nova-compute[71205]: 2023-04-24 00:14:58 Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: 128 Apr 24 00:14:58 user nova-compute[71205]: 1 Apr 24 00:14:58 user nova-compute[71205]: 0 Apr 24 00:14:58 user nova-compute[71205]: 0 Apr 24 00:14:58 user nova-compute[71205]: 1 Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: tempest-ServerActionsTestJSON-1785200975-project-member Apr 24 00:14:58 user nova-compute[71205]: tempest-ServerActionsTestJSON-1785200975 Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: OpenStack Foundation Apr 24 00:14:58 user nova-compute[71205]: OpenStack Nova Apr 24 00:14:58 user nova-compute[71205]: 0.0.0 Apr 24 00:14:58 user nova-compute[71205]: 
cf2c88d5-8347-4166-a037-158f29c32d1a Apr 24 00:14:58 user nova-compute[71205]: cf2c88d5-8347-4166-a037-158f29c32d1a Apr 24 00:14:58 user nova-compute[71205]: Virtual Machine Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: hvm Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Nehalem Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: /dev/urandom Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: Apr 24 00:14:58 user nova-compute[71205]: {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:14:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-75602253',display_name='tempest-ServerActionsTestJSON-server-75602253',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serveractionstestjson-server-75602253',id=17,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHBH505bABPeY47tqqwKUNL0j9rZei8h+xMYLIM0SjTR/WY833MZWOzZDU3xf+eKarpT1DDNfoU7YX1pgGnkISVtzLfDBcBBJeM+PXJDnAxAnfo3AHLlU/Wx1vdEG2Tqyg==',key_name='tempest-keypair-1442678964',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='211ed190ee5c4a0b98b85960339ea437',ramdisk_id='',reservation_id='r-q1azt7t2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1785200975',owner_user_name='tempest-ServerActionsTestJSON-1785200975-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:14:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='8536b0b100af4c81b0114b37a10ec017',uuid=cf2c88d5-8347-4166-a037-158f29c32d1a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "783a3713-64fb-48f3-b3ba-0312249006eb", "address": "fa:16:3e:87:78:10", "network": {"id": "551bb188-8d6d-484a-813f-5f76f0ec0e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1308396726-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "211ed190ee5c4a0b98b85960339ea437", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap783a3713-64", "ovs_interfaceid": "783a3713-64fb-48f3-b3ba-0312249006eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Converting VIF {"id": "783a3713-64fb-48f3-b3ba-0312249006eb", "address": "fa:16:3e:87:78:10", "network": {"id": "551bb188-8d6d-484a-813f-5f76f0ec0e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1308396726-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "211ed190ee5c4a0b98b85960339ea437", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap783a3713-64", "ovs_interfaceid": "783a3713-64fb-48f3-b3ba-0312249006eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:78:10,bridge_name='br-int',has_traffic_filtering=True,id=783a3713-64fb-48f3-b3ba-0312249006eb,network=Network(551bb188-8d6d-484a-813f-5f76f0ec0e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap783a3713-64') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG os_vif [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:78:10,bridge_name='br-int',has_traffic_filtering=True,id=783a3713-64fb-48f3-b3ba-0312249006eb,network=Network(551bb188-8d6d-484a-813f-5f76f0ec0e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap783a3713-64') {{(pid=71205) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap783a3713-64, may_exist=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap783a3713-64, col_values=(('external_ids', {'iface-id': '783a3713-64fb-48f3-b3ba-0312249006eb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:87:78:10', 'vm-uuid': 'cf2c88d5-8347-4166-a037-158f29c32d1a'}),)) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog 
[-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:58 user nova-compute[71205]: INFO os_vif [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:78:10,bridge_name='br-int',has_traffic_filtering=True,id=783a3713-64fb-48f3-b3ba-0312249006eb,network=Network(551bb188-8d6d-484a-813f-5f76f0ec0e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap783a3713-64') Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] No BDM found with device name vda, not building metadata. {{(pid=71205) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] No VIF found with MAC fa:16:3e:87:78:10, not building metadata {{(pid=71205) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.network.neutron [req-082148f4-5c38-4a4a-a8ab-984b2f003a63 req-437d4fa3-1139-48d8-a102-696c757d7af9 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Updated VIF entry in instance network info cache for port 783a3713-64fb-48f3-b3ba-0312249006eb. 
{{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG nova.network.neutron [req-082148f4-5c38-4a4a-a8ab-984b2f003a63 req-437d4fa3-1139-48d8-a102-696c757d7af9 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Updating instance_info_cache with network_info: [{"id": "783a3713-64fb-48f3-b3ba-0312249006eb", "address": "fa:16:3e:87:78:10", "network": {"id": "551bb188-8d6d-484a-813f-5f76f0ec0e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1308396726-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "211ed190ee5c4a0b98b85960339ea437", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap783a3713-64", "ovs_interfaceid": "783a3713-64fb-48f3-b3ba-0312249006eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:14:58 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-082148f4-5c38-4a4a-a8ab-984b2f003a63 req-437d4fa3-1139-48d8-a102-696c757d7af9 service nova] Releasing lock "refresh_cache-cf2c88d5-8347-4166-a037-158f29c32d1a" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:14:59 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:59 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:14:59 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:00 user nova-compute[71205]: DEBUG nova.compute.manager [req-f0a0c9a7-bb91-4554-8441-d129056c3286 req-3b954229-755a-43ad-8a1d-0970fa805ec9 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Received event network-vif-plugged-783a3713-64fb-48f3-b3ba-0312249006eb {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:15:00 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-f0a0c9a7-bb91-4554-8441-d129056c3286 req-3b954229-755a-43ad-8a1d-0970fa805ec9 service nova] Acquiring lock "cf2c88d5-8347-4166-a037-158f29c32d1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:15:00 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-f0a0c9a7-bb91-4554-8441-d129056c3286 req-3b954229-755a-43ad-8a1d-0970fa805ec9 service nova] Lock "cf2c88d5-8347-4166-a037-158f29c32d1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:15:00 user 
nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-f0a0c9a7-bb91-4554-8441-d129056c3286 req-3b954229-755a-43ad-8a1d-0970fa805ec9 service nova] Lock "cf2c88d5-8347-4166-a037-158f29c32d1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:15:00 user nova-compute[71205]: DEBUG nova.compute.manager [req-f0a0c9a7-bb91-4554-8441-d129056c3286 req-3b954229-755a-43ad-8a1d-0970fa805ec9 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] No waiting events found dispatching network-vif-plugged-783a3713-64fb-48f3-b3ba-0312249006eb {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:15:00 user nova-compute[71205]: WARNING nova.compute.manager [req-f0a0c9a7-bb91-4554-8441-d129056c3286 req-3b954229-755a-43ad-8a1d-0970fa805ec9 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Received unexpected event network-vif-plugged-783a3713-64fb-48f3-b3ba-0312249006eb for instance with vm_state building and task_state spawning. Apr 24 00:15:00 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:00 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:00 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:00 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:01 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Resumed> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:15:01 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] VM Resumed (Lifecycle Event) Apr 24 00:15:01 user nova-compute[71205]: DEBUG nova.compute.manager [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Instance event wait completed in 0 seconds for {{(pid=71205) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 24 00:15:01 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Guest created on hypervisor {{(pid=71205) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 24 00:15:01 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Instance spawned successfully. 
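
The sequence above (Converting VIF ... / Converted object VIFOpenVSwitch(...) / Plugging vif ... / ovsdbapp AddBridgeCommand, AddPortCommand and DbSetCommand / Successfully plugged vif) is nova handing the Neutron port to os-vif, whose ovs plugin writes the tap port and its external_ids (iface-id, attached-mac, vm-uuid) into the local Open vSwitch database. Below is a hedged, stand-alone sketch of that call with the IDs copied from the log; the trimmed field set is an assumption, and actually running it requires the privileges and OVS environment that nova-compute has.

    # Hedged sketch of the os-vif plug call recorded above; values are taken
    # from the log entries, not a definitive reimplementation of nova's path.
    import os_vif
    from os_vif.objects import instance_info
    from os_vif.objects import network
    from os_vif.objects import vif

    os_vif.initialize()  # loads the linux_bridge/noop/ovs plugins

    port = vif.VIFOpenVSwitch(
        id='783a3713-64fb-48f3-b3ba-0312249006eb',
        address='fa:16:3e:87:78:10',
        vif_name='tap783a3713-64',
        bridge_name='br-int',
        has_traffic_filtering=True,
        preserve_on_delete=False,
        port_profile=vif.VIFPortProfileOpenVSwitch(
            interface_id='783a3713-64fb-48f3-b3ba-0312249006eb'),
        network=network.Network(
            id='551bb188-8d6d-484a-813f-5f76f0ec0e85',
            bridge='br-int', mtu=1442))

    instance = instance_info.InstanceInfo(
        uuid='cf2c88d5-8347-4166-a037-158f29c32d1a',
        name='instance-00000011')

    # The ovs plugin turns this call into the AddBridgeCommand/AddPortCommand/
    # DbSetCommand transactions visible in the ovsdbapp debug lines above.
    os_vif.plug(port, instance)

The network-vif-plugged external events that follow in the log are Neutron's asynchronous confirmation of this binding.
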
Apr 24 00:15:01 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 24 00:15:02 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:15:02 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:15:02 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Found default for hw_cdrom_bus of ide {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:15:02 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Found default for hw_disk_bus of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:15:02 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Found default for hw_input_bus of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:15:02 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Found default for hw_pointer_model of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:15:02 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Found default for hw_video_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:15:02 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Found 
default for hw_vif_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:15:02 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:15:02 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Started> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:15:02 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] VM Started (Lifecycle Event) Apr 24 00:15:02 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:15:02 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:15:02 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:15:02 user nova-compute[71205]: INFO nova.compute.manager [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Took 5.41 seconds to spawn the instance on the hypervisor. 
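
The two "During sync_power_state the instance has a pending task (spawning). Skip." lines above show how the Resumed/Started lifecycle events are reconciled with the build that is still in flight: while a task_state is set, the recorded DB power_state (0, NOSTATE) is deliberately not overwritten with the hypervisor's view (1, RUNNING). A toy sketch of that decision, not nova's actual handler:

    # Toy illustration of the skip decision logged above (not nova's code).
    NOSTATE, RUNNING = 0, 1


    def reconcile_power_state(db_power_state, vm_power_state, task_state):
        """Return the power state that should be recorded after a lifecycle event."""
        if task_state is not None:
            # e.g. task_state == 'spawning' above: another operation owns the
            # instance right now, so the event handler skips the update.
            return db_power_state
        return vm_power_state


    assert reconcile_power_state(NOSTATE, RUNNING, 'spawning') == NOSTATE  # "Skip."
    assert reconcile_power_state(NOSTATE, RUNNING, None) == RUNNING
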
Apr 24 00:15:02 user nova-compute[71205]: DEBUG nova.compute.manager [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:15:02 user nova-compute[71205]: DEBUG nova.compute.manager [req-da97c8f6-9994-47ad-aea9-4a4a3637a26e req-2f29a694-ec71-4258-92ee-fe005d1b4b8b service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Received event network-vif-plugged-783a3713-64fb-48f3-b3ba-0312249006eb {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:15:02 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-da97c8f6-9994-47ad-aea9-4a4a3637a26e req-2f29a694-ec71-4258-92ee-fe005d1b4b8b service nova] Acquiring lock "cf2c88d5-8347-4166-a037-158f29c32d1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:15:02 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-da97c8f6-9994-47ad-aea9-4a4a3637a26e req-2f29a694-ec71-4258-92ee-fe005d1b4b8b service nova] Lock "cf2c88d5-8347-4166-a037-158f29c32d1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:15:02 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-da97c8f6-9994-47ad-aea9-4a4a3637a26e req-2f29a694-ec71-4258-92ee-fe005d1b4b8b service nova] Lock "cf2c88d5-8347-4166-a037-158f29c32d1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:15:02 user nova-compute[71205]: DEBUG nova.compute.manager [req-da97c8f6-9994-47ad-aea9-4a4a3637a26e req-2f29a694-ec71-4258-92ee-fe005d1b4b8b service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] No waiting events found dispatching network-vif-plugged-783a3713-64fb-48f3-b3ba-0312249006eb {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:15:02 user nova-compute[71205]: WARNING nova.compute.manager [req-da97c8f6-9994-47ad-aea9-4a4a3637a26e req-2f29a694-ec71-4258-92ee-fe005d1b4b8b service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Received unexpected event network-vif-plugged-783a3713-64fb-48f3-b3ba-0312249006eb for instance with vm_state building and task_state spawning. Apr 24 00:15:02 user nova-compute[71205]: INFO nova.compute.manager [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Took 6.12 seconds to build instance. 
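
The "Acquiring lock ... by ..." / "acquired ... waited" / ""released" ... held" triplets that recur throughout this log, including the per-instance lock released just below after 6.205s and the short-lived cf2c88d5-8347-4166-a037-158f29c32d1a-events locks above, are emitted by oslo.concurrency's synchronized wrapper. A minimal sketch of the pattern, with an illustrative function body:

    # Minimal sketch of the locking pattern behind those log lines; the inner
    # wrapper in oslo_concurrency.lockutils is what logs the acquire, waited
    # and held messages (lockutils.py:404/409/423 above).
    from oslo_concurrency import lockutils


    @lockutils.synchronized('cf2c88d5-8347-4166-a037-158f29c32d1a')
    def _locked_do_build_and_run_instance():
        # Everything in here runs with the per-instance lock held, so a
        # concurrent delete or reboot of the same instance serializes behind
        # the build; that is why the release below reports "held 6.205s".
        pass


    _locked_do_build_and_run_instance()

The cf2c88d5-8347-4166-a037-158f29c32d1a-events lock follows the same pattern but only guards the small in-memory event dictionary, which is why it is reported as held for 0.000-0.001s.
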
Apr 24 00:15:02 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-df9c089c-e22f-4894-a8d8-1cc2c58132f1 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Lock "cf2c88d5-8347-4166-a037-158f29c32d1a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.205s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:15:03 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:03 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:15:05 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9069d4cd-47e7-469f-8357-ed5bf0d2943f tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Acquiring lock "1821ecf2-8c71-48ad-96da-f63b83439c6d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:15:05 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9069d4cd-47e7-469f-8357-ed5bf0d2943f tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "1821ecf2-8c71-48ad-96da-f63b83439c6d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:15:05 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9069d4cd-47e7-469f-8357-ed5bf0d2943f tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Acquiring lock "1821ecf2-8c71-48ad-96da-f63b83439c6d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:15:05 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9069d4cd-47e7-469f-8357-ed5bf0d2943f tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "1821ecf2-8c71-48ad-96da-f63b83439c6d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:15:05 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9069d4cd-47e7-469f-8357-ed5bf0d2943f tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "1821ecf2-8c71-48ad-96da-f63b83439c6d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:15:05 user nova-compute[71205]: INFO nova.compute.manager [None req-9069d4cd-47e7-469f-8357-ed5bf0d2943f tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Terminating instance Apr 24 00:15:05 user nova-compute[71205]: DEBUG nova.compute.manager [None 
req-9069d4cd-47e7-469f-8357-ed5bf0d2943f tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Start destroying the instance on the hypervisor. {{(pid=71205) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 24 00:15:05 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:05 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:05 user nova-compute[71205]: DEBUG nova.compute.manager [req-404ee0d9-99e8-4ed5-b0e5-22f63a0ffd04 req-20537d46-847a-4b57-9608-14249578405f service nova] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Received event network-vif-unplugged-86355003-a71c-4c4f-9536-beab8f09ded2 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:15:05 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-404ee0d9-99e8-4ed5-b0e5-22f63a0ffd04 req-20537d46-847a-4b57-9608-14249578405f service nova] Acquiring lock "1821ecf2-8c71-48ad-96da-f63b83439c6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:15:05 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-404ee0d9-99e8-4ed5-b0e5-22f63a0ffd04 req-20537d46-847a-4b57-9608-14249578405f service nova] Lock "1821ecf2-8c71-48ad-96da-f63b83439c6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:15:05 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-404ee0d9-99e8-4ed5-b0e5-22f63a0ffd04 req-20537d46-847a-4b57-9608-14249578405f service nova] Lock "1821ecf2-8c71-48ad-96da-f63b83439c6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:15:05 user nova-compute[71205]: DEBUG nova.compute.manager [req-404ee0d9-99e8-4ed5-b0e5-22f63a0ffd04 req-20537d46-847a-4b57-9608-14249578405f service nova] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] No waiting events found dispatching network-vif-unplugged-86355003-a71c-4c4f-9536-beab8f09ded2 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:15:05 user nova-compute[71205]: DEBUG nova.compute.manager [req-404ee0d9-99e8-4ed5-b0e5-22f63a0ffd04 req-20537d46-847a-4b57-9608-14249578405f service nova] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Received event network-vif-unplugged-86355003-a71c-4c4f-9536-beab8f09ded2 for instance with task_state deleting. 
{{(pid=71205) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 24 00:15:05 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:05 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:05 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:06 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Instance destroyed successfully. Apr 24 00:15:06 user nova-compute[71205]: DEBUG nova.objects.instance [None req-9069d4cd-47e7-469f-8357-ed5bf0d2943f tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lazy-loading 'resources' on Instance uuid 1821ecf2-8c71-48ad-96da-f63b83439c6d {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:15:06 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-9069d4cd-47e7-469f-8357-ed5bf0d2943f tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:13:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1213468308',display_name='tempest-ServersNegativeTestJSON-server-1213468308',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1213468308',id=11,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-24T00:13:22Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='ce75f63fc0904eceb03e8319bddba4d3',ramdisk_id='',reservation_id='r-o0umdweg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersNegativeTestJSON-380105770',owner_user_name='tempest-ServersNegativeTestJSON-380105770-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-24T00:13:23Z,user_data=None,user_id='abae98323deb44dea0622186485cc7af',uuid=1821ecf2-8c71-48ad-96da-f63b8343
9c6d,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "86355003-a71c-4c4f-9536-beab8f09ded2", "address": "fa:16:3e:fb:43:8b", "network": {"id": "2b9edd97-2af0-4b09-993c-95cc2b427fd8", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1649565193-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ce75f63fc0904eceb03e8319bddba4d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap86355003-a7", "ovs_interfaceid": "86355003-a71c-4c4f-9536-beab8f09ded2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 24 00:15:06 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-9069d4cd-47e7-469f-8357-ed5bf0d2943f tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Converting VIF {"id": "86355003-a71c-4c4f-9536-beab8f09ded2", "address": "fa:16:3e:fb:43:8b", "network": {"id": "2b9edd97-2af0-4b09-993c-95cc2b427fd8", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1649565193-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ce75f63fc0904eceb03e8319bddba4d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap86355003-a7", "ovs_interfaceid": "86355003-a71c-4c4f-9536-beab8f09ded2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:15:06 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-9069d4cd-47e7-469f-8357-ed5bf0d2943f tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fb:43:8b,bridge_name='br-int',has_traffic_filtering=True,id=86355003-a71c-4c4f-9536-beab8f09ded2,network=Network(2b9edd97-2af0-4b09-993c-95cc2b427fd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86355003-a7') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:15:06 user nova-compute[71205]: DEBUG os_vif [None req-9069d4cd-47e7-469f-8357-ed5bf0d2943f tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Unplugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:fb:43:8b,bridge_name='br-int',has_traffic_filtering=True,id=86355003-a71c-4c4f-9536-beab8f09ded2,network=Network(2b9edd97-2af0-4b09-993c-95cc2b427fd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86355003-a7') {{(pid=71205) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 24 00:15:06 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:06 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap86355003-a7, bridge=br-int, if_exists=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:15:06 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:06 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:06 user nova-compute[71205]: INFO os_vif [None req-9069d4cd-47e7-469f-8357-ed5bf0d2943f tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fb:43:8b,bridge_name='br-int',has_traffic_filtering=True,id=86355003-a71c-4c4f-9536-beab8f09ded2,network=Network(2b9edd97-2af0-4b09-993c-95cc2b427fd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86355003-a7') Apr 24 00:15:06 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-9069d4cd-47e7-469f-8357-ed5bf0d2943f tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Deleting instance files /opt/stack/data/nova/instances/1821ecf2-8c71-48ad-96da-f63b83439c6d_del Apr 24 00:15:06 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-9069d4cd-47e7-469f-8357-ed5bf0d2943f tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Deletion of /opt/stack/data/nova/instances/1821ecf2-8c71-48ad-96da-f63b83439c6d_del complete Apr 24 00:15:06 user nova-compute[71205]: INFO nova.compute.manager [None req-9069d4cd-47e7-469f-8357-ed5bf0d2943f tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Took 0.79 seconds to destroy the instance on the hypervisor. Apr 24 00:15:06 user nova-compute[71205]: DEBUG oslo.service.loopingcall [None req-9069d4cd-47e7-469f-8357-ed5bf0d2943f tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71205) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 24 00:15:06 user nova-compute[71205]: DEBUG nova.compute.manager [-] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Deallocating network for instance {{(pid=71205) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 24 00:15:06 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] deallocate_for_instance() {{(pid=71205) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 24 00:15:07 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:15:07 user nova-compute[71205]: DEBUG nova.compute.manager [req-29adbc7c-eec6-408b-b3a6-5b3df82fc4cc req-b8a8c96b-9e5b-4403-ad13-ef6d616efbd3 service nova] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Received event network-vif-deleted-86355003-a71c-4c4f-9536-beab8f09ded2 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:15:07 user nova-compute[71205]: INFO nova.compute.manager [req-29adbc7c-eec6-408b-b3a6-5b3df82fc4cc req-b8a8c96b-9e5b-4403-ad13-ef6d616efbd3 service nova] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Neutron deleted interface 86355003-a71c-4c4f-9536-beab8f09ded2; detaching it from the instance and deleting it from the info cache Apr 24 00:15:07 user nova-compute[71205]: DEBUG nova.network.neutron [req-29adbc7c-eec6-408b-b3a6-5b3df82fc4cc req-b8a8c96b-9e5b-4403-ad13-ef6d616efbd3 service nova] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:15:07 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Took 1.18 seconds to deallocate network for instance. Apr 24 00:15:07 user nova-compute[71205]: DEBUG nova.compute.manager [req-29adbc7c-eec6-408b-b3a6-5b3df82fc4cc req-b8a8c96b-9e5b-4403-ad13-ef6d616efbd3 service nova] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Detach interface failed, port_id=86355003-a71c-4c4f-9536-beab8f09ded2, reason: Instance 1821ecf2-8c71-48ad-96da-f63b83439c6d could not be found. 
{{(pid=71205) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 24 00:15:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9069d4cd-47e7-469f-8357-ed5bf0d2943f tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:15:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9069d4cd-47e7-469f-8357-ed5bf0d2943f tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:15:07 user nova-compute[71205]: DEBUG nova.compute.manager [req-42475c89-3084-4a2f-bcc9-9e55c2d462c8 req-96782808-6efe-4392-8263-ab6b5ce56894 service nova] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Received event network-vif-plugged-86355003-a71c-4c4f-9536-beab8f09ded2 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:15:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-42475c89-3084-4a2f-bcc9-9e55c2d462c8 req-96782808-6efe-4392-8263-ab6b5ce56894 service nova] Acquiring lock "1821ecf2-8c71-48ad-96da-f63b83439c6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:15:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-42475c89-3084-4a2f-bcc9-9e55c2d462c8 req-96782808-6efe-4392-8263-ab6b5ce56894 service nova] Lock "1821ecf2-8c71-48ad-96da-f63b83439c6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:15:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-42475c89-3084-4a2f-bcc9-9e55c2d462c8 req-96782808-6efe-4392-8263-ab6b5ce56894 service nova] Lock "1821ecf2-8c71-48ad-96da-f63b83439c6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:15:07 user nova-compute[71205]: DEBUG nova.compute.manager [req-42475c89-3084-4a2f-bcc9-9e55c2d462c8 req-96782808-6efe-4392-8263-ab6b5ce56894 service nova] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] No waiting events found dispatching network-vif-plugged-86355003-a71c-4c4f-9536-beab8f09ded2 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:15:07 user nova-compute[71205]: WARNING nova.compute.manager [req-42475c89-3084-4a2f-bcc9-9e55c2d462c8 req-96782808-6efe-4392-8263-ab6b5ce56894 service nova] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Received unexpected event network-vif-plugged-86355003-a71c-4c4f-9536-beab8f09ded2 for instance with vm_state deleted and task_state None. 
Apr 24 00:15:07 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-9069d4cd-47e7-469f-8357-ed5bf0d2943f tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:15:07 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-9069d4cd-47e7-469f-8357-ed5bf0d2943f tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:15:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9069d4cd-47e7-469f-8357-ed5bf0d2943f tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.354s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:15:07 user nova-compute[71205]: INFO nova.scheduler.client.report [None req-9069d4cd-47e7-469f-8357-ed5bf0d2943f tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Deleted allocations for instance 1821ecf2-8c71-48ad-96da-f63b83439c6d Apr 24 00:15:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9069d4cd-47e7-469f-8357-ed5bf0d2943f tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "1821ecf2-8c71-48ad-96da-f63b83439c6d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.524s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:15:08 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:11 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:11 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:15:11 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:15:11 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71205) 
run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:15:11 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71205) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 24 00:15:12 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:15:12 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:15:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:15:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:15:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:15:12 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:15:12 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/43f004f3-9b3f-4388-88a8-8eb663ba36a3/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:15:12 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/43f004f3-9b3f-4388-88a8-8eb663ba36a3/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:15:12 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/43f004f3-9b3f-4388-88a8-8eb663ba36a3/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:15:12 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/43f004f3-9b3f-4388-88a8-8eb663ba36a3/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:15:12 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:15:12 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:15:12 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:15:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:15:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk.rescue --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:15:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk.rescue --force-share --output=json" returned: 0 in 0.140s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 
00:15:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk.rescue --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:15:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk.rescue --force-share --output=json" returned: 0 in 0.137s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:15:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:15:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:15:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:15:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:15:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/eb48c285-06f8-4d48-a550-c3fb3c05e93a/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:15:13 user 
nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/eb48c285-06f8-4d48-a550-c3fb3c05e93a/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:15:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/eb48c285-06f8-4d48-a550-c3fb3c05e93a/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:15:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/eb48c285-06f8-4d48-a550-c3fb3c05e93a/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:15:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:15:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:15:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:15:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:15:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env 
LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:15:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:15:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:15:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:15:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/38c59ffc-494a-4d2b-a199-226a6e7cc683/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:15:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/38c59ffc-494a-4d2b-a199-226a6e7cc683/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:15:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/38c59ffc-494a-4d2b-a199-226a6e7cc683/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:15:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/38c59ffc-494a-4d2b-a199-226a6e7cc683/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71205) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:15:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:15:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:15:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:15:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:15:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:15:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json" returned: 0 in 0.178s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:15:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:15:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD 
"/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:15:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f2a3766c-0a08-4eb5-a833-e39eb73d3426/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:15:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f2a3766c-0a08-4eb5-a833-e39eb73d3426/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:15:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f2a3766c-0a08-4eb5-a833-e39eb73d3426/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:15:15 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f2a3766c-0a08-4eb5-a833-e39eb73d3426/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:15:16 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:15:16 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 24 00:15:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Hypervisor/Node resource view: name=user free_ram=7908MB free_disk=26.514019012451172GB free_vcpus=2 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:15:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:15:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:15:16 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance ce19423d-a6ee-4506-9cd1-ec4803abdd86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:15:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:15:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:15:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance ac38bbc2-2229-4497-b501-e9230ec59a32 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:15:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance f1a14b79-7792-4962-bbe1-ec11e10e6948 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:15:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance f2a3766c-0a08-4eb5-a833-e39eb73d3426 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:15:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance 38c59ffc-494a-4d2b-a199-226a6e7cc683 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:15:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance eb48c285-06f8-4d48-a550-c3fb3c05e93a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:15:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance 43f004f3-9b3f-4388-88a8-8eb663ba36a3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:15:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance cf2c88d5-8347-4166-a037-158f29c32d1a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:15:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Total usable vcpus: 12, total allocated vcpus: 10 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:15:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Final resource view: name=user phys_ram=16023MB used_ram=1792MB phys_disk=40GB used_disk=10GB total_vcpus=12 used_vcpus=10 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:15:16 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:15:16 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:15:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:15:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.425s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:15:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquiring lock "1c91af3a-b837-4ff0-a236-3483ffe5277d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:15:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "1c91af3a-b837-4ff0-a236-3483ffe5277d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:15:17 user nova-compute[71205]: DEBUG nova.compute.manager [None req-36066562-5536-47c6-bf2d-ca546b35362a 
tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Starting instance... {{(pid=71205) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 24 00:15:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:15:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:15:17 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71205) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 24 00:15:17 user nova-compute[71205]: INFO nova.compute.claims [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Claim successful on node user Apr 24 00:15:17 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:15:17 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:15:17 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Starting heal instance info cache {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 24 00:15:17 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:15:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "refresh_cache-dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:15:17 user nova-compute[71205]: DEBUG 
oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquired lock "refresh_cache-dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:15:17 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Forcefully refreshing network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 24 00:15:17 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:15:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.414s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:15:17 user nova-compute[71205]: DEBUG nova.compute.manager [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Start building networks asynchronously for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 24 00:15:17 user nova-compute[71205]: DEBUG nova.compute.manager [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Allocating IP information in the background. {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 24 00:15:17 user nova-compute[71205]: DEBUG nova.network.neutron [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] allocate_for_instance() {{(pid=71205) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 24 00:15:17 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 24 00:15:17 user nova-compute[71205]: DEBUG nova.compute.manager [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Start building block device mappings for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 24 00:15:17 user nova-compute[71205]: DEBUG nova.policy [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8d0ab07106dd4995aa7e3f5b6bc70e56', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd26ba1ed4b9241f9a084db1a14a945bb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71205) authorize /opt/stack/nova/nova/policy.py:203}} Apr 24 00:15:17 user nova-compute[71205]: DEBUG nova.compute.manager [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Start spawning the instance on the hypervisor. {{(pid=71205) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 24 00:15:17 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Creating instance directory {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 24 00:15:17 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Creating image(s) Apr 24 00:15:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquiring lock "/opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:15:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "/opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:15:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None 
req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "/opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:15:17 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:15:17 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.131s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:15:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquiring lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:15:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:15:17 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:15:18 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Updating instance_info_cache with network_info: [{"id": "d6f98d8d-f918-4aa5-abb0-a34e782f890a", "address": "fa:16:3e:b2:2c:52", "network": {"id": 
"fc4a840b-9c4f-4c52-8a4e-411d66ac14e1", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2047577326-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.15", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df0187dbb10d42da941645107df203f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6f98d8d-f9", "ovs_interfaceid": "d6f98d8d-f918-4aa5-abb0-a34e782f890a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:15:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.136s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:15:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d/disk 1073741824 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:15:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Releasing lock "refresh_cache-dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:15:18 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Updated the network info_cache for instance {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 24 00:15:18 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:15:18 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:15:18 user nova-compute[71205]: DEBUG 
oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:15:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d/disk 1073741824" returned: 0 in 0.049s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:15:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.188s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:15:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:15:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.130s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:15:18 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Checking if we can resize image /opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d/disk. 
size=1073741824 {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 24 00:15:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:15:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:15:18 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Cannot resize image /opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d/disk to a smaller size. {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 24 00:15:18 user nova-compute[71205]: DEBUG nova.objects.instance [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lazy-loading 'migration_context' on Instance uuid 1c91af3a-b837-4ff0-a236-3483ffe5277d {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:15:18 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Created local disks {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 24 00:15:18 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Ensure instance console log exists: /opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d/console.log {{(pid=71205) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 24 00:15:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:15:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None 
req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:15:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:15:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:18 user nova-compute[71205]: DEBUG nova.network.neutron [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Successfully created port: 7cef5b8b-d733-4f6a-8567-4b9e7e650fbb {{(pid=71205) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.network.neutron [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Successfully updated port: 7cef5b8b-d733-4f6a-8567-4b9e7e650fbb {{(pid=71205) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquiring lock "refresh_cache-1c91af3a-b837-4ff0-a236-3483ffe5277d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquired lock "refresh_cache-1c91af3a-b837-4ff0-a236-3483ffe5277d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.network.neutron [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Building network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.compute.manager [req-d673b6eb-72ad-48d6-b7d2-a0dbbcd51f7f req-2a93cd83-48c4-4e97-8ce8-7f2aef28423a service nova] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Received event network-changed-7cef5b8b-d733-4f6a-8567-4b9e7e650fbb {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG 
nova.compute.manager [req-d673b6eb-72ad-48d6-b7d2-a0dbbcd51f7f req-2a93cd83-48c4-4e97-8ce8-7f2aef28423a service nova] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Refreshing instance network info cache due to event network-changed-7cef5b8b-d733-4f6a-8567-4b9e7e650fbb. {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-d673b6eb-72ad-48d6-b7d2-a0dbbcd51f7f req-2a93cd83-48c4-4e97-8ce8-7f2aef28423a service nova] Acquiring lock "refresh_cache-1c91af3a-b837-4ff0-a236-3483ffe5277d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.network.neutron [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Instance cache missing network info. {{(pid=71205) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.network.neutron [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Updating instance_info_cache with network_info: [{"id": "7cef5b8b-d733-4f6a-8567-4b9e7e650fbb", "address": "fa:16:3e:ab:a9:8e", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cef5b8b-d7", "ovs_interfaceid": "7cef5b8b-d733-4f6a-8567-4b9e7e650fbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Releasing lock "refresh_cache-1c91af3a-b837-4ff0-a236-3483ffe5277d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.compute.manager [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Instance network_info: |[{"id": "7cef5b8b-d733-4f6a-8567-4b9e7e650fbb", "address": "fa:16:3e:ab:a9:8e", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": 
"10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cef5b8b-d7", "ovs_interfaceid": "7cef5b8b-d733-4f6a-8567-4b9e7e650fbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-d673b6eb-72ad-48d6-b7d2-a0dbbcd51f7f req-2a93cd83-48c4-4e97-8ce8-7f2aef28423a service nova] Acquired lock "refresh_cache-1c91af3a-b837-4ff0-a236-3483ffe5277d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.network.neutron [req-d673b6eb-72ad-48d6-b7d2-a0dbbcd51f7f req-2a93cd83-48c4-4e97-8ce8-7f2aef28423a service nova] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Refreshing network info cache for port 7cef5b8b-d733-4f6a-8567-4b9e7e650fbb {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Start _get_guest_xml network_info=[{"id": "7cef5b8b-d733-4f6a-8567-4b9e7e650fbb", "address": "fa:16:3e:ab:a9:8e", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cef5b8b-d7", "ovs_interfaceid": "7cef5b8b-d733-4f6a-8567-4b9e7e650fbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=) rescue=None 
block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': 'fcf09ead-c5af-40cc-b5cf-92626e181ef9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 24 00:15:19 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:15:19 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71205) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-24T00:08:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=), allow threads: True {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Flavor limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Image limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-36066562-5536-47c6-bf2d-ca546b35362a 
tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Flavor pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Image pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Got 1 possible topologies {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:15:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-515303589',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-515303589',id=18,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d26ba1ed4b9241f9a084db1a14a945bb',ramdisk_id='',reservation_id='r-b2izt7wx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-2021792443',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:15:18Z,user_data=None,user_id='8d0ab07106dd4995aa7e3f5b6bc70e56',uuid=1c91af3a-b837-4ff0-a236-3483ffe5277d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7cef5b8b-d733-4f6a-8567-4b9e7e650fbb", "address": "fa:16:3e:ab:a9:8e", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cef5b8b-d7", "ovs_interfaceid": "7cef5b8b-d733-4f6a-8567-4b9e7e650fbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71205) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Converting VIF {"id": "7cef5b8b-d733-4f6a-8567-4b9e7e650fbb", "address": 
"fa:16:3e:ab:a9:8e", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cef5b8b-d7", "ovs_interfaceid": "7cef5b8b-d733-4f6a-8567-4b9e7e650fbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:a9:8e,bridge_name='br-int',has_traffic_filtering=True,id=7cef5b8b-d733-4f6a-8567-4b9e7e650fbb,network=Network(52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7cef5b8b-d7') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.objects.instance [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lazy-loading 'pci_devices' on Instance uuid 1c91af3a-b837-4ff0-a236-3483ffe5277d {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] End _get_guest_xml xml= Apr 24 00:15:19 user nova-compute[71205]: 1c91af3a-b837-4ff0-a236-3483ffe5277d Apr 24 00:15:19 user nova-compute[71205]: instance-00000012 Apr 24 00:15:19 user nova-compute[71205]: 131072 Apr 24 00:15:19 user nova-compute[71205]: 1 Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: tempest-ServerBootFromVolumeStableRescueTest-server-515303589 Apr 24 00:15:19 user nova-compute[71205]: 2023-04-24 00:15:19 Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: 128 Apr 24 00:15:19 user nova-compute[71205]: 1 Apr 24 00:15:19 user nova-compute[71205]: 0 Apr 24 00:15:19 user nova-compute[71205]: 0 Apr 24 00:15:19 user nova-compute[71205]: 1 Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member Apr 24 00:15:19 user nova-compute[71205]: tempest-ServerBootFromVolumeStableRescueTest-2021792443 Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user 
nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: OpenStack Foundation Apr 24 00:15:19 user nova-compute[71205]: OpenStack Nova Apr 24 00:15:19 user nova-compute[71205]: 0.0.0 Apr 24 00:15:19 user nova-compute[71205]: 1c91af3a-b837-4ff0-a236-3483ffe5277d Apr 24 00:15:19 user nova-compute[71205]: 1c91af3a-b837-4ff0-a236-3483ffe5277d Apr 24 00:15:19 user nova-compute[71205]: Virtual Machine Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: hvm Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Nehalem Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: /dev/urandom Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: Apr 24 00:15:19 user nova-compute[71205]: {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:15:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-515303589',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-515303589',id=18,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d26ba1ed4b9241f9a084db1a14a945bb',ramdisk_id='',reservation_id='r-b2izt7wx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-2021792443',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:15:18Z,user_data=None,user_id='8d0ab07106dd4995aa7e3f5b6bc70e56',uuid=1c91af3a-b837-4ff0-a236-3483ffe5277d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7cef5b8b-d733-4f6a-8567-4b9e7e650fbb", "address": "fa:16:3e:ab:a9:8e", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cef5b8b-d7", "ovs_interfaceid": "7cef5b8b-d733-4f6a-8567-4b9e7e650fbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Converting VIF {"id": "7cef5b8b-d733-4f6a-8567-4b9e7e650fbb", "address": 
"fa:16:3e:ab:a9:8e", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cef5b8b-d7", "ovs_interfaceid": "7cef5b8b-d733-4f6a-8567-4b9e7e650fbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:a9:8e,bridge_name='br-int',has_traffic_filtering=True,id=7cef5b8b-d733-4f6a-8567-4b9e7e650fbb,network=Network(52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7cef5b8b-d7') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG os_vif [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:a9:8e,bridge_name='br-int',has_traffic_filtering=True,id=7cef5b8b-d733-4f6a-8567-4b9e7e650fbb,network=Network(52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7cef5b8b-d7') {{(pid=71205) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7cef5b8b-d7, may_exist=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:15:19 user 
nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7cef5b8b-d7, col_values=(('external_ids', {'iface-id': '7cef5b8b-d733-4f6a-8567-4b9e7e650fbb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ab:a9:8e', 'vm-uuid': '1c91af3a-b837-4ff0-a236-3483ffe5277d'}),)) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:19 user nova-compute[71205]: INFO os_vif [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:a9:8e,bridge_name='br-int',has_traffic_filtering=True,id=7cef5b8b-d733-4f6a-8567-4b9e7e650fbb,network=Network(52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7cef5b8b-d7') Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] No BDM found with device name vda, not building metadata. {{(pid=71205) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 24 00:15:19 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] No VIF found with MAC fa:16:3e:ab:a9:8e, not building metadata {{(pid=71205) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 24 00:15:20 user nova-compute[71205]: DEBUG nova.network.neutron [req-d673b6eb-72ad-48d6-b7d2-a0dbbcd51f7f req-2a93cd83-48c4-4e97-8ce8-7f2aef28423a service nova] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Updated VIF entry in instance network info cache for port 7cef5b8b-d733-4f6a-8567-4b9e7e650fbb. 
{{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:15:20 user nova-compute[71205]: DEBUG nova.network.neutron [req-d673b6eb-72ad-48d6-b7d2-a0dbbcd51f7f req-2a93cd83-48c4-4e97-8ce8-7f2aef28423a service nova] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Updating instance_info_cache with network_info: [{"id": "7cef5b8b-d733-4f6a-8567-4b9e7e650fbb", "address": "fa:16:3e:ab:a9:8e", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cef5b8b-d7", "ovs_interfaceid": "7cef5b8b-d733-4f6a-8567-4b9e7e650fbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:15:20 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-d673b6eb-72ad-48d6-b7d2-a0dbbcd51f7f req-2a93cd83-48c4-4e97-8ce8-7f2aef28423a service nova] Releasing lock "refresh_cache-1c91af3a-b837-4ff0-a236-3483ffe5277d" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:15:20 user nova-compute[71205]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:15:20 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] VM Stopped (Lifecycle Event) Apr 24 00:15:20 user nova-compute[71205]: DEBUG nova.compute.manager [None req-e3429be9-b43e-45ad-aa69-db8677c308e8 None None] [instance: 1821ecf2-8c71-48ad-96da-f63b83439c6d] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:15:21 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:21 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:21 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:21 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:21 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:21 user nova-compute[71205]: DEBUG nova.compute.manager [req-f860c3c2-ee8a-4fda-ab61-c35ee73e8b24 req-2e6a735b-d26d-4cb6-b609-ea50fa9e997f service nova] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Received event 
network-vif-plugged-7cef5b8b-d733-4f6a-8567-4b9e7e650fbb {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:15:21 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-f860c3c2-ee8a-4fda-ab61-c35ee73e8b24 req-2e6a735b-d26d-4cb6-b609-ea50fa9e997f service nova] Acquiring lock "1c91af3a-b837-4ff0-a236-3483ffe5277d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:15:21 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-f860c3c2-ee8a-4fda-ab61-c35ee73e8b24 req-2e6a735b-d26d-4cb6-b609-ea50fa9e997f service nova] Lock "1c91af3a-b837-4ff0-a236-3483ffe5277d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:15:21 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-f860c3c2-ee8a-4fda-ab61-c35ee73e8b24 req-2e6a735b-d26d-4cb6-b609-ea50fa9e997f service nova] Lock "1c91af3a-b837-4ff0-a236-3483ffe5277d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:15:21 user nova-compute[71205]: DEBUG nova.compute.manager [req-f860c3c2-ee8a-4fda-ab61-c35ee73e8b24 req-2e6a735b-d26d-4cb6-b609-ea50fa9e997f service nova] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] No waiting events found dispatching network-vif-plugged-7cef5b8b-d733-4f6a-8567-4b9e7e650fbb {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:15:21 user nova-compute[71205]: WARNING nova.compute.manager [req-f860c3c2-ee8a-4fda-ab61-c35ee73e8b24 req-2e6a735b-d26d-4cb6-b609-ea50fa9e997f service nova] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Received unexpected event network-vif-plugged-7cef5b8b-d733-4f6a-8567-4b9e7e650fbb for instance with vm_state building and task_state spawning. Apr 24 00:15:23 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Resumed> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:15:23 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] VM Resumed (Lifecycle Event) Apr 24 00:15:23 user nova-compute[71205]: DEBUG nova.compute.manager [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Instance event wait completed in 0 seconds for {{(pid=71205) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 24 00:15:23 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Guest created on hypervisor {{(pid=71205) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 24 00:15:23 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Instance spawned successfully. 
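The AddBridgeCommand/AddPortCommand/DbSetCommand transactions above are what os-vif's OVS plugin emits through ovsdbapp when it plugs tap7cef5b8b-d7 into br-int. A minimal standalone sketch of the same port-plug transaction in Python, assuming ovsdbapp's documented Open_vSwitch API; the OVSDB endpoint is an assumption, while the port name, MAC and iface-id values are copied from the log:

    # Sketch only: replays the br-int port-plug transaction logged above.
    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    OVSDB = 'tcp:127.0.0.1:6640'   # assumed OVSDB endpoint; adjust per host
    conn = connection.Connection(
        idl=connection.OvsdbIdl.from_server(OVSDB, 'Open_vSwitch'), timeout=10)
    api = impl_idl.OvsdbIdl(conn)

    external_ids = {               # values from the DbSetCommand entry above
        'iface-id': '7cef5b8b-d733-4f6a-8567-4b9e7e650fbb',
        'iface-status': 'active',
        'attached-mac': 'fa:16:3e:ab:a9:8e',
        'vm-uuid': '1c91af3a-b837-4ff0-a236-3483ffe5277d',
    }

    # One transaction: idempotent port add, then Interface.external_ids so the
    # bound driver (ovn in this log) can bind the logical switch port.
    with api.transaction(check_error=True) as txn:
        txn.add(api.add_port('br-int', 'tap7cef5b8b-d7', may_exist=True))
        txn.add(api.db_set('Interface', 'tap7cef5b8b-d7',
                           ('external_ids', external_ids)))

The "Transaction caused no change" entry earlier in the plug sequence is the may_exist=True AddBridgeCommand finding br-int already present, which is why the bridge add is left out of the sketch.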
Apr 24 00:15:23 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 24 00:15:23 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:15:23 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Found default for hw_cdrom_bus of ide {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:15:23 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Found default for hw_disk_bus of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:15:23 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Found default for hw_input_bus of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:15:23 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Found default for hw_pointer_model of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:15:23 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Found default for hw_video_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:15:23 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Found default for hw_vif_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:15:23 user nova-compute[71205]: DEBUG nova.compute.manager [None 
req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:15:23 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:15:23 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Started> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:15:23 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] VM Started (Lifecycle Event) Apr 24 00:15:23 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:15:23 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:15:23 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:15:23 user nova-compute[71205]: INFO nova.compute.manager [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Took 5.58 seconds to spawn the instance on the hypervisor. 
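The two "Synchronizing instance power state" entries above compare the DB power_state (0, NOSTATE) against the hypervisor power_state (1, RUNNING) after the Resumed and Started lifecycle events, and skip the sync because the spawning task is still in flight. An illustrative restatement of that decision, not Nova's actual code; only the numeric constants follow nova.compute.power_state:

    # Illustrative only: mirrors the skip/update decision logged above.
    NOSTATE, RUNNING = 0, 1   # values used by nova.compute.power_state

    def sync_power_state(db_power_state, vm_power_state, task_state):
        """Describe what the lifecycle-event power sync would do."""
        if task_state is not None:
            # "During sync_power_state the instance has a pending task
            # (spawning). Skip." -- the in-flight operation owns the state.
            return 'skip: pending task %s' % task_state
        if db_power_state != vm_power_state:
            return 'update DB power_state %d -> %d' % (db_power_state, vm_power_state)
        return 'in sync'

    print(sync_power_state(NOSTATE, RUNNING, 'spawning'))  # skip: pending task spawning
    print(sync_power_state(NOSTATE, RUNNING, None))        # update DB power_state 0 -> 1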
Apr 24 00:15:23 user nova-compute[71205]: DEBUG nova.compute.manager [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:15:23 user nova-compute[71205]: DEBUG nova.compute.manager [req-689f9281-91ca-4431-9f10-f1c973b301dc req-b04bf49a-6f5d-4696-a6f0-d5e1de47a4b2 service nova] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Received event network-vif-plugged-7cef5b8b-d733-4f6a-8567-4b9e7e650fbb {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:15:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-689f9281-91ca-4431-9f10-f1c973b301dc req-b04bf49a-6f5d-4696-a6f0-d5e1de47a4b2 service nova] Acquiring lock "1c91af3a-b837-4ff0-a236-3483ffe5277d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:15:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-689f9281-91ca-4431-9f10-f1c973b301dc req-b04bf49a-6f5d-4696-a6f0-d5e1de47a4b2 service nova] Lock "1c91af3a-b837-4ff0-a236-3483ffe5277d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:15:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-689f9281-91ca-4431-9f10-f1c973b301dc req-b04bf49a-6f5d-4696-a6f0-d5e1de47a4b2 service nova] Lock "1c91af3a-b837-4ff0-a236-3483ffe5277d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:15:23 user nova-compute[71205]: DEBUG nova.compute.manager [req-689f9281-91ca-4431-9f10-f1c973b301dc req-b04bf49a-6f5d-4696-a6f0-d5e1de47a4b2 service nova] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] No waiting events found dispatching network-vif-plugged-7cef5b8b-d733-4f6a-8567-4b9e7e650fbb {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:15:23 user nova-compute[71205]: WARNING nova.compute.manager [req-689f9281-91ca-4431-9f10-f1c973b301dc req-b04bf49a-6f5d-4696-a6f0-d5e1de47a4b2 service nova] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Received unexpected event network-vif-plugged-7cef5b8b-d733-4f6a-8567-4b9e7e650fbb for instance with vm_state building and task_state spawning. Apr 24 00:15:23 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:23 user nova-compute[71205]: INFO nova.compute.manager [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Took 6.31 seconds to build instance. 
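The per-instance timings ("Took 5.58 seconds to spawn", "Took 6.31 seconds to build") and the repeated "Received unexpected event network-vif-plugged-..." warnings are the lines usually worth pulling out of a run like this; the warning here simply reflects what the preceding entry says, namely that no waiter was registered when the plug event arrived for the still-building instance. A small, hypothetical helper that extracts both from a saved journal (the regexes only match the line shapes shown in this log):

    # Hypothetical helper: scan a saved nova-compute journal for the timing
    # and unexpected-event lines that appear in this section.
    import re
    import sys

    TOOK = re.compile(
        r'\[instance: (?P<uuid>[0-9a-f-]{36})\] Took (?P<secs>[\d.]+) seconds '
        r'to (?P<what>spawn the instance on the hypervisor|build instance)')
    UNEXPECTED = re.compile(
        r'\[instance: (?P<uuid>[0-9a-f-]{36})\] Received unexpected event '
        r'(?P<event>\S+) for instance with vm_state (?P<state>\S+)')

    def scan(path):
        with open(path, encoding='utf-8', errors='replace') as fh:
            for line in fh:
                m = TOOK.search(line)
                if m:
                    print('timing  %(uuid)s  %(secs)ss  %(what)s' % m.groupdict())
                m = UNEXPECTED.search(line)
                if m:
                    print('warning %(uuid)s  %(event)s  vm_state=%(state)s' % m.groupdict())

    if __name__ == '__main__':
        scan(sys.argv[1])   # e.g. python scan_log.py nova-compute.log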
Apr 24 00:15:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-36066562-5536-47c6-bf2d-ca546b35362a tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "1c91af3a-b837-4ff0-a236-3483ffe5277d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.407s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:15:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:28 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-942b647b-3a46-44d3-a60e-0a03d1719b1f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Acquiring lock "f2a3766c-0a08-4eb5-a833-e39eb73d3426" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:15:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-942b647b-3a46-44d3-a60e-0a03d1719b1f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "f2a3766c-0a08-4eb5-a833-e39eb73d3426" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:15:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-942b647b-3a46-44d3-a60e-0a03d1719b1f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Acquiring lock "f2a3766c-0a08-4eb5-a833-e39eb73d3426-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:15:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-942b647b-3a46-44d3-a60e-0a03d1719b1f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "f2a3766c-0a08-4eb5-a833-e39eb73d3426-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:15:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-942b647b-3a46-44d3-a60e-0a03d1719b1f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "f2a3766c-0a08-4eb5-a833-e39eb73d3426-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:15:32 user nova-compute[71205]: INFO nova.compute.manager [None req-942b647b-3a46-44d3-a60e-0a03d1719b1f 
tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Terminating instance Apr 24 00:15:32 user nova-compute[71205]: DEBUG nova.compute.manager [None req-942b647b-3a46-44d3-a60e-0a03d1719b1f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Start destroying the instance on the hypervisor. {{(pid=71205) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 24 00:15:32 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:32 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:32 user nova-compute[71205]: DEBUG nova.compute.manager [req-ec29d3b1-5b17-4f70-8f7c-f814f72ece03 req-7832a4e4-39b4-47ee-93ea-b5a6fc20aae9 service nova] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Received event network-vif-unplugged-5b300cf4-3c09-440b-8992-08ebf1a3d958 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:15:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-ec29d3b1-5b17-4f70-8f7c-f814f72ece03 req-7832a4e4-39b4-47ee-93ea-b5a6fc20aae9 service nova] Acquiring lock "f2a3766c-0a08-4eb5-a833-e39eb73d3426-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:15:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-ec29d3b1-5b17-4f70-8f7c-f814f72ece03 req-7832a4e4-39b4-47ee-93ea-b5a6fc20aae9 service nova] Lock "f2a3766c-0a08-4eb5-a833-e39eb73d3426-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:15:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-ec29d3b1-5b17-4f70-8f7c-f814f72ece03 req-7832a4e4-39b4-47ee-93ea-b5a6fc20aae9 service nova] Lock "f2a3766c-0a08-4eb5-a833-e39eb73d3426-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:15:32 user nova-compute[71205]: DEBUG nova.compute.manager [req-ec29d3b1-5b17-4f70-8f7c-f814f72ece03 req-7832a4e4-39b4-47ee-93ea-b5a6fc20aae9 service nova] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] No waiting events found dispatching network-vif-unplugged-5b300cf4-3c09-440b-8992-08ebf1a3d958 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:15:32 user nova-compute[71205]: DEBUG nova.compute.manager [req-ec29d3b1-5b17-4f70-8f7c-f814f72ece03 req-7832a4e4-39b4-47ee-93ea-b5a6fc20aae9 service nova] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Received event network-vif-unplugged-5b300cf4-3c09-440b-8992-08ebf1a3d958 for instance with task_state deleting. 
{{(pid=71205) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 24 00:15:32 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:32 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:33 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Instance destroyed successfully. Apr 24 00:15:33 user nova-compute[71205]: DEBUG nova.objects.instance [None req-942b647b-3a46-44d3-a60e-0a03d1719b1f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lazy-loading 'resources' on Instance uuid f2a3766c-0a08-4eb5-a833-e39eb73d3426 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:15:33 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-942b647b-3a46-44d3-a60e-0a03d1719b1f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:13:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-228310302',display_name='tempest-VolumesAdminNegativeTest-server-228310302',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-228310302',id=12,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-24T00:13:47Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='df0187dbb10d42da941645107df203f6',ramdisk_id='',reservation_id='r-ilhhrvbt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesAdminNegativeTest-821594302',owner_user_name='tempest-VolumesAdminNegativeTest-821594302-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-24T00:13:47Z,user_data=None,user_id='640ec20e46a2422a8aabcc152e522e02',uuid=f2a3766c-0a08-4eb5-a833-e39eb73d3426,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5b300cf4-3c09-440b-8992-08ebf1a3d958", "address": "fa:16:3e:e3:2d:d1", "network": {"id": 
"fc4a840b-9c4f-4c52-8a4e-411d66ac14e1", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2047577326-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df0187dbb10d42da941645107df203f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b300cf4-3c", "ovs_interfaceid": "5b300cf4-3c09-440b-8992-08ebf1a3d958", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 24 00:15:33 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-942b647b-3a46-44d3-a60e-0a03d1719b1f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Converting VIF {"id": "5b300cf4-3c09-440b-8992-08ebf1a3d958", "address": "fa:16:3e:e3:2d:d1", "network": {"id": "fc4a840b-9c4f-4c52-8a4e-411d66ac14e1", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2047577326-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df0187dbb10d42da941645107df203f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b300cf4-3c", "ovs_interfaceid": "5b300cf4-3c09-440b-8992-08ebf1a3d958", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:15:33 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-942b647b-3a46-44d3-a60e-0a03d1719b1f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e3:2d:d1,bridge_name='br-int',has_traffic_filtering=True,id=5b300cf4-3c09-440b-8992-08ebf1a3d958,network=Network(fc4a840b-9c4f-4c52-8a4e-411d66ac14e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b300cf4-3c') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:15:33 user nova-compute[71205]: DEBUG os_vif [None req-942b647b-3a46-44d3-a60e-0a03d1719b1f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e3:2d:d1,bridge_name='br-int',has_traffic_filtering=True,id=5b300cf4-3c09-440b-8992-08ebf1a3d958,network=Network(fc4a840b-9c4f-4c52-8a4e-411d66ac14e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b300cf4-3c') {{(pid=71205) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 24 00:15:33 user 
nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:33 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5b300cf4-3c, bridge=br-int, if_exists=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:15:33 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:33 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:15:33 user nova-compute[71205]: INFO os_vif [None req-942b647b-3a46-44d3-a60e-0a03d1719b1f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e3:2d:d1,bridge_name='br-int',has_traffic_filtering=True,id=5b300cf4-3c09-440b-8992-08ebf1a3d958,network=Network(fc4a840b-9c4f-4c52-8a4e-411d66ac14e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b300cf4-3c') Apr 24 00:15:33 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-942b647b-3a46-44d3-a60e-0a03d1719b1f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Deleting instance files /opt/stack/data/nova/instances/f2a3766c-0a08-4eb5-a833-e39eb73d3426_del Apr 24 00:15:33 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-942b647b-3a46-44d3-a60e-0a03d1719b1f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Deletion of /opt/stack/data/nova/instances/f2a3766c-0a08-4eb5-a833-e39eb73d3426_del complete Apr 24 00:15:33 user nova-compute[71205]: INFO nova.compute.manager [None req-942b647b-3a46-44d3-a60e-0a03d1719b1f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Took 0.64 seconds to destroy the instance on the hypervisor. Apr 24 00:15:33 user nova-compute[71205]: DEBUG oslo.service.loopingcall [None req-942b647b-3a46-44d3-a60e-0a03d1719b1f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71205) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 24 00:15:33 user nova-compute[71205]: DEBUG nova.compute.manager [-] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Deallocating network for instance {{(pid=71205) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 24 00:15:33 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] deallocate_for_instance() {{(pid=71205) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 24 00:15:33 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:33 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:15:33 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Took 0.55 seconds to deallocate network for instance. Apr 24 00:15:33 user nova-compute[71205]: DEBUG nova.compute.manager [req-1e4397f6-eeb6-44cf-9b39-cde21836d359 req-af92ee75-703d-4f37-87b7-02f5ac3c3b3b service nova] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Received event network-vif-deleted-5b300cf4-3c09-440b-8992-08ebf1a3d958 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:15:33 user nova-compute[71205]: INFO nova.compute.manager [req-1e4397f6-eeb6-44cf-9b39-cde21836d359 req-af92ee75-703d-4f37-87b7-02f5ac3c3b3b service nova] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Neutron deleted interface 5b300cf4-3c09-440b-8992-08ebf1a3d958; detaching it from the instance and deleting it from the info cache Apr 24 00:15:33 user nova-compute[71205]: DEBUG nova.network.neutron [req-1e4397f6-eeb6-44cf-9b39-cde21836d359 req-af92ee75-703d-4f37-87b7-02f5ac3c3b3b service nova] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:15:33 user nova-compute[71205]: DEBUG nova.compute.manager [req-1e4397f6-eeb6-44cf-9b39-cde21836d359 req-af92ee75-703d-4f37-87b7-02f5ac3c3b3b service nova] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Detach interface failed, port_id=5b300cf4-3c09-440b-8992-08ebf1a3d958, reason: Instance f2a3766c-0a08-4eb5-a833-e39eb73d3426 could not be found. 
{{(pid=71205) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 24 00:15:33 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-942b647b-3a46-44d3-a60e-0a03d1719b1f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:15:33 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-942b647b-3a46-44d3-a60e-0a03d1719b1f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:15:34 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-942b647b-3a46-44d3-a60e-0a03d1719b1f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:15:34 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-942b647b-3a46-44d3-a60e-0a03d1719b1f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:15:34 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-942b647b-3a46-44d3-a60e-0a03d1719b1f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.314s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:15:34 user nova-compute[71205]: INFO nova.scheduler.client.report [None req-942b647b-3a46-44d3-a60e-0a03d1719b1f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Deleted allocations for instance f2a3766c-0a08-4eb5-a833-e39eb73d3426 Apr 24 00:15:34 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-942b647b-3a46-44d3-a60e-0a03d1719b1f tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "f2a3766c-0a08-4eb5-a833-e39eb73d3426" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.728s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:15:34 user nova-compute[71205]: DEBUG nova.compute.manager [req-9099e8cf-00c4-4de3-89bf-2dc6c8b13faa req-53c96b54-a4ae-4c3b-8b6b-eb451bf57cc2 service nova] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Received event 
network-vif-plugged-5b300cf4-3c09-440b-8992-08ebf1a3d958 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:15:34 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-9099e8cf-00c4-4de3-89bf-2dc6c8b13faa req-53c96b54-a4ae-4c3b-8b6b-eb451bf57cc2 service nova] Acquiring lock "f2a3766c-0a08-4eb5-a833-e39eb73d3426-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:15:34 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-9099e8cf-00c4-4de3-89bf-2dc6c8b13faa req-53c96b54-a4ae-4c3b-8b6b-eb451bf57cc2 service nova] Lock "f2a3766c-0a08-4eb5-a833-e39eb73d3426-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:15:34 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-9099e8cf-00c4-4de3-89bf-2dc6c8b13faa req-53c96b54-a4ae-4c3b-8b6b-eb451bf57cc2 service nova] Lock "f2a3766c-0a08-4eb5-a833-e39eb73d3426-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:15:34 user nova-compute[71205]: DEBUG nova.compute.manager [req-9099e8cf-00c4-4de3-89bf-2dc6c8b13faa req-53c96b54-a4ae-4c3b-8b6b-eb451bf57cc2 service nova] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] No waiting events found dispatching network-vif-plugged-5b300cf4-3c09-440b-8992-08ebf1a3d958 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:15:34 user nova-compute[71205]: WARNING nova.compute.manager [req-9099e8cf-00c4-4de3-89bf-2dc6c8b13faa req-53c96b54-a4ae-4c3b-8b6b-eb451bf57cc2 service nova] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Received unexpected event network-vif-plugged-5b300cf4-3c09-440b-8992-08ebf1a3d958 for instance with vm_state deleted and task_state None. 
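The teardown above mirrors the boot path: the VIF is unplugged (DelPortCommand on br-int), the instance files are removed, the network is deallocated, resource-tracker usage is updated, and the placement allocations for f2a3766c-0a08-4eb5-a833-e39eb73d3426 are deleted. For completeness, a sketch of driving the same unplug through os-vif's public API directly, populating only the fields visible in the log; the minimal field set is an assumption and the InstanceInfo name is a placeholder (Nova passes fully populated objects):

    # Sketch only: calls os-vif directly with the values logged for the
    # unplug of tap5b300cf4-3c. Needs the same privileges nova-compute has.
    import os_vif
    from os_vif.objects import instance_info, network, vif

    os_vif.initialize()

    net = network.Network(id='fc4a840b-9c4f-4c52-8a4e-411d66ac14e1',
                          bridge='br-int')
    ovs_vif = vif.VIFOpenVSwitch(
        id='5b300cf4-3c09-440b-8992-08ebf1a3d958',
        address='fa:16:3e:e3:2d:d1',
        vif_name='tap5b300cf4-3c',
        bridge_name='br-int',
        plugin='ovs',
        network=net,
        port_profile=vif.VIFPortProfileOpenVSwitch(
            interface_id='5b300cf4-3c09-440b-8992-08ebf1a3d958'))
    inst = instance_info.InstanceInfo(
        uuid='f2a3766c-0a08-4eb5-a833-e39eb73d3426',
        name='instance-name-placeholder')   # hypothetical, not from the log

    os_vif.unplug(ovs_vif, inst)   # results in the DelPortCommand seen above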
Apr 24 00:15:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:43 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:43 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:48 user nova-compute[71205]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:15:48 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] VM Stopped (Lifecycle Event) Apr 24 00:15:48 user nova-compute[71205]: DEBUG nova.compute.manager [None req-ff892ff1-7b26-40bd-9024-3afbfee3fb00 None None] [instance: f2a3766c-0a08-4eb5-a833-e39eb73d3426] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:15:48 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:48 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:56 user nova-compute[71205]: DEBUG nova.compute.manager [req-4a8cc661-d5e5-482c-a7ac-77a8f8eddb39 req-6a82e9f5-52c1-4d83-ab3e-27622f0ffcde service nova] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Received event network-changed-e9b64819-4e5d-4a42-aa9c-d460bb336094 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:15:56 user nova-compute[71205]: DEBUG nova.compute.manager [req-4a8cc661-d5e5-482c-a7ac-77a8f8eddb39 req-6a82e9f5-52c1-4d83-ab3e-27622f0ffcde service nova] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Refreshing instance network info cache due to event network-changed-e9b64819-4e5d-4a42-aa9c-d460bb336094. 
{{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:15:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-4a8cc661-d5e5-482c-a7ac-77a8f8eddb39 req-6a82e9f5-52c1-4d83-ab3e-27622f0ffcde service nova] Acquiring lock "refresh_cache-38c59ffc-494a-4d2b-a199-226a6e7cc683" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:15:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-4a8cc661-d5e5-482c-a7ac-77a8f8eddb39 req-6a82e9f5-52c1-4d83-ab3e-27622f0ffcde service nova] Acquired lock "refresh_cache-38c59ffc-494a-4d2b-a199-226a6e7cc683" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:15:56 user nova-compute[71205]: DEBUG nova.network.neutron [req-4a8cc661-d5e5-482c-a7ac-77a8f8eddb39 req-6a82e9f5-52c1-4d83-ab3e-27622f0ffcde service nova] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Refreshing network info cache for port e9b64819-4e5d-4a42-aa9c-d460bb336094 {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:15:57 user nova-compute[71205]: DEBUG nova.network.neutron [req-4a8cc661-d5e5-482c-a7ac-77a8f8eddb39 req-6a82e9f5-52c1-4d83-ab3e-27622f0ffcde service nova] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Updated VIF entry in instance network info cache for port e9b64819-4e5d-4a42-aa9c-d460bb336094. {{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:15:57 user nova-compute[71205]: DEBUG nova.network.neutron [req-4a8cc661-d5e5-482c-a7ac-77a8f8eddb39 req-6a82e9f5-52c1-4d83-ab3e-27622f0ffcde service nova] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Updating instance_info_cache with network_info: [{"id": "e9b64819-4e5d-4a42-aa9c-d460bb336094", "address": "fa:16:3e:66:d7:60", "network": {"id": "c29459b8-bcaa-41ae-94fa-d61344bae22f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2015616970-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.109", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d063c2bdc884fb8b826b9fb6fd97405", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape9b64819-4e", "ovs_interfaceid": "e9b64819-4e5d-4a42-aa9c-d460bb336094", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:15:57 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-4a8cc661-d5e5-482c-a7ac-77a8f8eddb39 req-6a82e9f5-52c1-4d83-ab3e-27622f0ffcde service nova] Releasing lock "refresh_cache-38c59ffc-494a-4d2b-a199-226a6e7cc683" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:15:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:58 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils 
[None req-f0e58108-e8d9-4fc4-8398-2071216bcbff tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Acquiring lock "38c59ffc-494a-4d2b-a199-226a6e7cc683" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:15:58 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-f0e58108-e8d9-4fc4-8398-2071216bcbff tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "38c59ffc-494a-4d2b-a199-226a6e7cc683" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:15:58 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-f0e58108-e8d9-4fc4-8398-2071216bcbff tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Acquiring lock "38c59ffc-494a-4d2b-a199-226a6e7cc683-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:15:58 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-f0e58108-e8d9-4fc4-8398-2071216bcbff tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "38c59ffc-494a-4d2b-a199-226a6e7cc683-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:15:58 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-f0e58108-e8d9-4fc4-8398-2071216bcbff tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "38c59ffc-494a-4d2b-a199-226a6e7cc683-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:15:58 user nova-compute[71205]: INFO nova.compute.manager [None req-f0e58108-e8d9-4fc4-8398-2071216bcbff tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Terminating instance Apr 24 00:15:58 user nova-compute[71205]: DEBUG nova.compute.manager [None req-f0e58108-e8d9-4fc4-8398-2071216bcbff tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Start destroying the instance on the hypervisor. 
{{(pid=71205) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 24 00:15:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:58 user nova-compute[71205]: DEBUG nova.compute.manager [req-df49cea3-a8d3-40cb-8b32-3605cb146bf5 req-0eaaa617-62cf-4862-9837-edb8a603b306 service nova] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Received event network-vif-unplugged-e9b64819-4e5d-4a42-aa9c-d460bb336094 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:15:58 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-df49cea3-a8d3-40cb-8b32-3605cb146bf5 req-0eaaa617-62cf-4862-9837-edb8a603b306 service nova] Acquiring lock "38c59ffc-494a-4d2b-a199-226a6e7cc683-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:15:58 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-df49cea3-a8d3-40cb-8b32-3605cb146bf5 req-0eaaa617-62cf-4862-9837-edb8a603b306 service nova] Lock "38c59ffc-494a-4d2b-a199-226a6e7cc683-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:15:58 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-df49cea3-a8d3-40cb-8b32-3605cb146bf5 req-0eaaa617-62cf-4862-9837-edb8a603b306 service nova] Lock "38c59ffc-494a-4d2b-a199-226a6e7cc683-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:15:58 user nova-compute[71205]: DEBUG nova.compute.manager [req-df49cea3-a8d3-40cb-8b32-3605cb146bf5 req-0eaaa617-62cf-4862-9837-edb8a603b306 service nova] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] No waiting events found dispatching network-vif-unplugged-e9b64819-4e5d-4a42-aa9c-d460bb336094 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:15:58 user nova-compute[71205]: DEBUG nova.compute.manager [req-df49cea3-a8d3-40cb-8b32-3605cb146bf5 req-0eaaa617-62cf-4862-9837-edb8a603b306 service nova] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Received event network-vif-unplugged-e9b64819-4e5d-4a42-aa9c-d460bb336094 for instance with task_state deleting. 
{{(pid=71205) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 24 00:15:58 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Instance destroyed successfully. Apr 24 00:15:58 user nova-compute[71205]: DEBUG nova.objects.instance [None req-f0e58108-e8d9-4fc4-8398-2071216bcbff tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lazy-loading 'resources' on Instance uuid 38c59ffc-494a-4d2b-a199-226a6e7cc683 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:15:58 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-f0e58108-e8d9-4fc4-8398-2071216bcbff tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:14:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1191061004',display_name='tempest-AttachVolumeTestJSON-server-1191061004',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-1191061004',id=13,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKcfTjP5WJcryNzsqlkun8gR9wahBjjV98bivT/lgH0hNMo4q0VTdNxQslRxtaeP2me2yW9f66D9TwOftMRDgzVFqTefGAIW0hAhjgdNDN4d4jy/uJaM5+9CQHsIpBa8tQ==',key_name='tempest-keypair-670582452',keypairs=,launch_index=0,launched_at=2023-04-24T00:14:12Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='1d063c2bdc884fb8b826b9fb6fd97405',ramdisk_id='',reservation_id='r-0x8zxbn0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeTestJSON-1425553791',owner_user_name='tempest-AttachVolumeTestJSON-1425553791-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-24T00:14:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='471d341199f0431a95ae54651c4f0780',uuid=38c59ffc-494a-4d2b-a199-226a6e7cc683,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e9b64819-4e5d-4a42-aa9c-d460bb336094", "address": "fa:16:3e:66:d7:60", "network": {"id": "c29459b8-bcaa-41ae-94fa-d61344bae22f", "bridge": "br-int", "label": 
"tempest-AttachVolumeTestJSON-2015616970-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.109", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d063c2bdc884fb8b826b9fb6fd97405", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape9b64819-4e", "ovs_interfaceid": "e9b64819-4e5d-4a42-aa9c-d460bb336094", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 24 00:15:58 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-f0e58108-e8d9-4fc4-8398-2071216bcbff tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Converting VIF {"id": "e9b64819-4e5d-4a42-aa9c-d460bb336094", "address": "fa:16:3e:66:d7:60", "network": {"id": "c29459b8-bcaa-41ae-94fa-d61344bae22f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2015616970-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.109", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d063c2bdc884fb8b826b9fb6fd97405", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape9b64819-4e", "ovs_interfaceid": "e9b64819-4e5d-4a42-aa9c-d460bb336094", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:15:58 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-f0e58108-e8d9-4fc4-8398-2071216bcbff tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:66:d7:60,bridge_name='br-int',has_traffic_filtering=True,id=e9b64819-4e5d-4a42-aa9c-d460bb336094,network=Network(c29459b8-bcaa-41ae-94fa-d61344bae22f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9b64819-4e') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:15:58 user nova-compute[71205]: DEBUG os_vif [None req-f0e58108-e8d9-4fc4-8398-2071216bcbff tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:66:d7:60,bridge_name='br-int',has_traffic_filtering=True,id=e9b64819-4e5d-4a42-aa9c-d460bb336094,network=Network(c29459b8-bcaa-41ae-94fa-d61344bae22f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9b64819-4e') {{(pid=71205) unplug 
/usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 24 00:15:59 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:59 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape9b64819-4e, bridge=br-int, if_exists=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:15:59 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:59 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:15:59 user nova-compute[71205]: INFO os_vif [None req-f0e58108-e8d9-4fc4-8398-2071216bcbff tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:66:d7:60,bridge_name='br-int',has_traffic_filtering=True,id=e9b64819-4e5d-4a42-aa9c-d460bb336094,network=Network(c29459b8-bcaa-41ae-94fa-d61344bae22f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9b64819-4e') Apr 24 00:15:59 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-f0e58108-e8d9-4fc4-8398-2071216bcbff tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Deleting instance files /opt/stack/data/nova/instances/38c59ffc-494a-4d2b-a199-226a6e7cc683_del Apr 24 00:15:59 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-f0e58108-e8d9-4fc4-8398-2071216bcbff tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Deletion of /opt/stack/data/nova/instances/38c59ffc-494a-4d2b-a199-226a6e7cc683_del complete Apr 24 00:15:59 user nova-compute[71205]: INFO nova.compute.manager [None req-f0e58108-e8d9-4fc4-8398-2071216bcbff tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Took 0.65 seconds to destroy the instance on the hypervisor. Apr 24 00:15:59 user nova-compute[71205]: DEBUG oslo.service.loopingcall [None req-f0e58108-e8d9-4fc4-8398-2071216bcbff tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71205) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 24 00:15:59 user nova-compute[71205]: DEBUG nova.compute.manager [-] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Deallocating network for instance {{(pid=71205) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 24 00:15:59 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] deallocate_for_instance() {{(pid=71205) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 24 00:15:59 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:16:00 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Took 0.92 seconds to deallocate network for instance. Apr 24 00:16:00 user nova-compute[71205]: DEBUG nova.compute.manager [req-b97ad9d6-eac5-4aef-bd6b-0f396082b1a8 req-24cae413-4bfe-4684-b26a-857211fff84a service nova] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Received event network-vif-deleted-e9b64819-4e5d-4a42-aa9c-d460bb336094 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:16:00 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-f0e58108-e8d9-4fc4-8398-2071216bcbff tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:16:00 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-f0e58108-e8d9-4fc4-8398-2071216bcbff tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:16:00 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-f0e58108-e8d9-4fc4-8398-2071216bcbff tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:16:00 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-f0e58108-e8d9-4fc4-8398-2071216bcbff tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:16:00 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-f0e58108-e8d9-4fc4-8398-2071216bcbff tempest-AttachVolumeTestJSON-1425553791 
tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.323s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:16:00 user nova-compute[71205]: INFO nova.scheduler.client.report [None req-f0e58108-e8d9-4fc4-8398-2071216bcbff tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Deleted allocations for instance 38c59ffc-494a-4d2b-a199-226a6e7cc683 Apr 24 00:16:00 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-f0e58108-e8d9-4fc4-8398-2071216bcbff tempest-AttachVolumeTestJSON-1425553791 tempest-AttachVolumeTestJSON-1425553791-project-member] Lock "38c59ffc-494a-4d2b-a199-226a6e7cc683" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.078s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:16:00 user nova-compute[71205]: DEBUG nova.compute.manager [req-d52d3d01-258e-4d8f-a361-1e7b9b6cb4fc req-501353b1-6536-4f96-8592-f9368c3648c8 service nova] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Received event network-vif-plugged-e9b64819-4e5d-4a42-aa9c-d460bb336094 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:16:00 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-d52d3d01-258e-4d8f-a361-1e7b9b6cb4fc req-501353b1-6536-4f96-8592-f9368c3648c8 service nova] Acquiring lock "38c59ffc-494a-4d2b-a199-226a6e7cc683-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:16:00 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-d52d3d01-258e-4d8f-a361-1e7b9b6cb4fc req-501353b1-6536-4f96-8592-f9368c3648c8 service nova] Lock "38c59ffc-494a-4d2b-a199-226a6e7cc683-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:16:00 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-d52d3d01-258e-4d8f-a361-1e7b9b6cb4fc req-501353b1-6536-4f96-8592-f9368c3648c8 service nova] Lock "38c59ffc-494a-4d2b-a199-226a6e7cc683-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:16:00 user nova-compute[71205]: DEBUG nova.compute.manager [req-d52d3d01-258e-4d8f-a361-1e7b9b6cb4fc req-501353b1-6536-4f96-8592-f9368c3648c8 service nova] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] No waiting events found dispatching network-vif-plugged-e9b64819-4e5d-4a42-aa9c-d460bb336094 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:16:00 user nova-compute[71205]: WARNING nova.compute.manager [req-d52d3d01-258e-4d8f-a361-1e7b9b6cb4fc req-501353b1-6536-4f96-8592-f9368c3648c8 service nova] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Received unexpected event network-vif-plugged-e9b64819-4e5d-4a42-aa9c-d460bb336094 for instance with vm_state deleted and task_state None. 
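The "Acquiring lock ... / acquired by ... waited / released by ... held" lines around terminate_instance come from oslo.concurrency's lockutils, which serializes all work on a single instance UUID so a delete cannot race other operations on the same guest. A minimal sketch of that pattern is shown below; it uses oslo_concurrency directly, and the handler name and wiring are illustrative rather than Nova's actual code.

    # Sketch of the per-instance locking pattern behind the
    # "Acquiring lock ... / acquired ... waited / released ... held"
    # DEBUG lines above. Assumes oslo.concurrency is installed;
    # the function name is made up for illustration.
    import logging

    from oslo_concurrency import lockutils

    # Root logger at DEBUG so lockutils' waited/held messages are visible.
    logging.basicConfig(level=logging.DEBUG)

    INSTANCE_UUID = "38c59ffc-494a-4d2b-a199-226a6e7cc683"


    def terminate(instance_uuid):
        # synchronized() wraps the call in a named lock and logs the
        # same acquire/release timings seen in the log excerpt.
        @lockutils.synchronized(instance_uuid)
        def do_terminate():
            print("destroying instance %s on the hypervisor" % instance_uuid)

        do_terminate()


    if __name__ == "__main__":
        terminate(INSTANCE_UUID)

The late "Received unexpected event network-vif-plugged-... for instance with vm_state deleted" WARNING above is the normal outcome of this serialization: by the time Neutron's event arrives, the lock holder has already finished deleting the instance, so there is no waiter to dispatch it to.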
Apr 24 00:16:03 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:04 user nova-compute[71205]: DEBUG nova.compute.manager [req-39b4fe7e-96d8-4047-b09f-589b96a5d4a3 req-46aafa1d-25e6-4c83-8082-7d1950d5eb86 service nova] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Received event network-changed-44f2910d-d36a-479f-8fa0-f16f9a406765 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:16:04 user nova-compute[71205]: DEBUG nova.compute.manager [req-39b4fe7e-96d8-4047-b09f-589b96a5d4a3 req-46aafa1d-25e6-4c83-8082-7d1950d5eb86 service nova] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Refreshing instance network info cache due to event network-changed-44f2910d-d36a-479f-8fa0-f16f9a406765. {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:16:04 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-39b4fe7e-96d8-4047-b09f-589b96a5d4a3 req-46aafa1d-25e6-4c83-8082-7d1950d5eb86 service nova] Acquiring lock "refresh_cache-eb48c285-06f8-4d48-a550-c3fb3c05e93a" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:16:04 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-39b4fe7e-96d8-4047-b09f-589b96a5d4a3 req-46aafa1d-25e6-4c83-8082-7d1950d5eb86 service nova] Acquired lock "refresh_cache-eb48c285-06f8-4d48-a550-c3fb3c05e93a" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:16:04 user nova-compute[71205]: DEBUG nova.network.neutron [req-39b4fe7e-96d8-4047-b09f-589b96a5d4a3 req-46aafa1d-25e6-4c83-8082-7d1950d5eb86 service nova] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Refreshing network info cache for port 44f2910d-d36a-479f-8fa0-f16f9a406765 {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:16:04 user nova-compute[71205]: DEBUG nova.network.neutron [req-39b4fe7e-96d8-4047-b09f-589b96a5d4a3 req-46aafa1d-25e6-4c83-8082-7d1950d5eb86 service nova] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Updated VIF entry in instance network info cache for port 44f2910d-d36a-479f-8fa0-f16f9a406765. 
{{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:16:04 user nova-compute[71205]: DEBUG nova.network.neutron [req-39b4fe7e-96d8-4047-b09f-589b96a5d4a3 req-46aafa1d-25e6-4c83-8082-7d1950d5eb86 service nova] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Updating instance_info_cache with network_info: [{"id": "44f2910d-d36a-479f-8fa0-f16f9a406765", "address": "fa:16:3e:ed:01:64", "network": {"id": "e09e353d-220c-429c-9767-a11a59bbf99a", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2107564613-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e2bf3154181247f8963be8cd31399851", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap44f2910d-d3", "ovs_interfaceid": "44f2910d-d36a-479f-8fa0-f16f9a406765", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:16:04 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-39b4fe7e-96d8-4047-b09f-589b96a5d4a3 req-46aafa1d-25e6-4c83-8082-7d1950d5eb86 service nova] Releasing lock "refresh_cache-eb48c285-06f8-4d48-a550-c3fb3c05e93a" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:16:05 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-f24bf9cd-99b5-4ff2-b836-5148b31fc362 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Acquiring lock "eb48c285-06f8-4d48-a550-c3fb3c05e93a" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:16:05 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-f24bf9cd-99b5-4ff2-b836-5148b31fc362 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "eb48c285-06f8-4d48-a550-c3fb3c05e93a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:16:05 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-f24bf9cd-99b5-4ff2-b836-5148b31fc362 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Acquiring lock "eb48c285-06f8-4d48-a550-c3fb3c05e93a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:16:05 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-f24bf9cd-99b5-4ff2-b836-5148b31fc362 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "eb48c285-06f8-4d48-a550-c3fb3c05e93a-events" 
acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:16:05 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-f24bf9cd-99b5-4ff2-b836-5148b31fc362 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "eb48c285-06f8-4d48-a550-c3fb3c05e93a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:16:05 user nova-compute[71205]: INFO nova.compute.manager [None req-f24bf9cd-99b5-4ff2-b836-5148b31fc362 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Terminating instance Apr 24 00:16:05 user nova-compute[71205]: DEBUG nova.compute.manager [None req-f24bf9cd-99b5-4ff2-b836-5148b31fc362 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Start destroying the instance on the hypervisor. {{(pid=71205) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 24 00:16:05 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:05 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:05 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:05 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:05 user nova-compute[71205]: DEBUG nova.compute.manager [req-a0f09388-b55a-4569-a8bd-6896c10172cd req-447b7870-24cb-4ec0-901e-69c409e50ca4 service nova] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Received event network-vif-unplugged-44f2910d-d36a-479f-8fa0-f16f9a406765 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:16:05 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-a0f09388-b55a-4569-a8bd-6896c10172cd req-447b7870-24cb-4ec0-901e-69c409e50ca4 service nova] Acquiring lock "eb48c285-06f8-4d48-a550-c3fb3c05e93a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:16:05 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-a0f09388-b55a-4569-a8bd-6896c10172cd req-447b7870-24cb-4ec0-901e-69c409e50ca4 service nova] Lock "eb48c285-06f8-4d48-a550-c3fb3c05e93a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:16:05 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-a0f09388-b55a-4569-a8bd-6896c10172cd req-447b7870-24cb-4ec0-901e-69c409e50ca4 service nova] Lock "eb48c285-06f8-4d48-a550-c3fb3c05e93a-events" 
"released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:16:05 user nova-compute[71205]: DEBUG nova.compute.manager [req-a0f09388-b55a-4569-a8bd-6896c10172cd req-447b7870-24cb-4ec0-901e-69c409e50ca4 service nova] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] No waiting events found dispatching network-vif-unplugged-44f2910d-d36a-479f-8fa0-f16f9a406765 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:16:05 user nova-compute[71205]: DEBUG nova.compute.manager [req-a0f09388-b55a-4569-a8bd-6896c10172cd req-447b7870-24cb-4ec0-901e-69c409e50ca4 service nova] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Received event network-vif-unplugged-44f2910d-d36a-479f-8fa0-f16f9a406765 for instance with task_state deleting. {{(pid=71205) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 24 00:16:06 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:06 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Instance destroyed successfully. Apr 24 00:16:06 user nova-compute[71205]: DEBUG nova.objects.instance [None req-f24bf9cd-99b5-4ff2-b836-5148b31fc362 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lazy-loading 'resources' on Instance uuid eb48c285-06f8-4d48-a550-c3fb3c05e93a {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:16:06 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-f24bf9cd-99b5-4ff2-b836-5148b31fc362 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:14:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-593649378',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-593649378',id=14,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH3GEjjTKC06op3WsPivqO+l4BSt54OlT00thV38HKoIy/ZNPvfWSo0jJrcHAlGb/+rJGJfe0UfKW92qrg1FUtDiP9SzVcY4eMX4ApUmWqGlNNLLT473cBCHXS8s2TWXxg==',key_name='tempest-keypair-1058824653',keypairs=,launch_index=0,launched_at=2023-04-24T00:14:22Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='e2bf3154181247f8963be8cd31399851',ramdisk_id='',reservation_id='r-tpmp2uga',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeShelveTestJSON-1947115496',owner_user_name='tempest-AttachVolumeShelveTestJSON-1947115496-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-24T00:14:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='539997e65f4f4ef7998a4386d19a5e9f',uuid=eb48c285-06f8-4d48-a550-c3fb3c05e93a,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "44f2910d-d36a-479f-8fa0-f16f9a406765", "address": "fa:16:3e:ed:01:64", "network": {"id": "e09e353d-220c-429c-9767-a11a59bbf99a", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2107564613-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e2bf3154181247f8963be8cd31399851", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap44f2910d-d3", "ovs_interfaceid": "44f2910d-d36a-479f-8fa0-f16f9a406765", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 24 00:16:06 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-f24bf9cd-99b5-4ff2-b836-5148b31fc362 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Converting VIF {"id": "44f2910d-d36a-479f-8fa0-f16f9a406765", "address": "fa:16:3e:ed:01:64", "network": {"id": "e09e353d-220c-429c-9767-a11a59bbf99a", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2107564613-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e2bf3154181247f8963be8cd31399851", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap44f2910d-d3", "ovs_interfaceid": "44f2910d-d36a-479f-8fa0-f16f9a406765", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:16:06 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-f24bf9cd-99b5-4ff2-b836-5148b31fc362 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ed:01:64,bridge_name='br-int',has_traffic_filtering=True,id=44f2910d-d36a-479f-8fa0-f16f9a406765,network=Network(e09e353d-220c-429c-9767-a11a59bbf99a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44f2910d-d3') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:16:06 user nova-compute[71205]: DEBUG os_vif [None req-f24bf9cd-99b5-4ff2-b836-5148b31fc362 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ed:01:64,bridge_name='br-int',has_traffic_filtering=True,id=44f2910d-d36a-479f-8fa0-f16f9a406765,network=Network(e09e353d-220c-429c-9767-a11a59bbf99a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44f2910d-d3') {{(pid=71205) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 24 00:16:06 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:06 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44f2910d-d3, bridge=br-int, if_exists=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:16:06 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:16:06 user nova-compute[71205]: INFO os_vif [None req-f24bf9cd-99b5-4ff2-b836-5148b31fc362 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ed:01:64,bridge_name='br-int',has_traffic_filtering=True,id=44f2910d-d36a-479f-8fa0-f16f9a406765,network=Network(e09e353d-220c-429c-9767-a11a59bbf99a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44f2910d-d3') Apr 24 00:16:06 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-f24bf9cd-99b5-4ff2-b836-5148b31fc362 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Deleting instance files 
/opt/stack/data/nova/instances/eb48c285-06f8-4d48-a550-c3fb3c05e93a_del Apr 24 00:16:06 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-f24bf9cd-99b5-4ff2-b836-5148b31fc362 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Deletion of /opt/stack/data/nova/instances/eb48c285-06f8-4d48-a550-c3fb3c05e93a_del complete Apr 24 00:16:06 user nova-compute[71205]: INFO nova.compute.manager [None req-f24bf9cd-99b5-4ff2-b836-5148b31fc362 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Took 0.85 seconds to destroy the instance on the hypervisor. Apr 24 00:16:06 user nova-compute[71205]: DEBUG oslo.service.loopingcall [None req-f24bf9cd-99b5-4ff2-b836-5148b31fc362 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71205) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 24 00:16:06 user nova-compute[71205]: DEBUG nova.compute.manager [-] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Deallocating network for instance {{(pid=71205) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 24 00:16:06 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] deallocate_for_instance() {{(pid=71205) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 24 00:16:07 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:16:07 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Took 0.96 seconds to deallocate network for instance. 
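The "Converting VIF ..." / "Unplugging vif VIFOpenVSwitch(...)" / "Successfully unplugged vif" sequence above is os_vif's public plug/unplug API being driven with an OVS-backed port, which the ovs plugin turns into the ovsdbapp DelPortCommand also visible in the log. The following is a hedged sketch of such an unplug call: the field values are copied from the log entries above, but the module paths and required-field choices are assumptions about the os_vif object API, not Nova's actual code path.

    # Sketch only: reproduces the shape of the os_vif unplug call that
    # yields "Unplugging vif VIFOpenVSwitch(...)" above. Object/module
    # names are assumptions about os_vif; values come from the log.
    import os_vif
    from os_vif.objects import instance_info, vif

    os_vif.initialize()  # loads the linux_bridge/noop/ovs plugins

    instance = instance_info.InstanceInfo(
        uuid="eb48c285-06f8-4d48-a550-c3fb3c05e93a",
        name="tempest-AttachVolumeShelveTestJSON-server-593649378")

    port = vif.VIFOpenVSwitch(
        id="44f2910d-d36a-479f-8fa0-f16f9a406765",
        address="fa:16:3e:ed:01:64",
        vif_name="tap44f2910d-d3",
        bridge_name="br-int",
        plugin="ovs",
        port_profile=vif.VIFPortProfileOpenVSwitch(
            interface_id="44f2910d-d36a-479f-8fa0-f16f9a406765"))

    # The ovs plugin translates this into the ovsdbapp transaction
    # DelPortCommand(port=tap44f2910d-d3, bridge=br-int, if_exists=True)
    # logged above.
    os_vif.unplug(port, instance)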
Apr 24 00:16:07 user nova-compute[71205]: DEBUG nova.compute.manager [req-a7e8f22b-160f-4513-b308-6a87960af367 req-e1d37797-72d6-4e4a-857c-63ccd48a533a service nova] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Received event network-vif-deleted-44f2910d-d36a-479f-8fa0-f16f9a406765 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:16:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-f24bf9cd-99b5-4ff2-b836-5148b31fc362 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:16:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-f24bf9cd-99b5-4ff2-b836-5148b31fc362 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:16:07 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-f24bf9cd-99b5-4ff2-b836-5148b31fc362 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:16:07 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-f24bf9cd-99b5-4ff2-b836-5148b31fc362 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:16:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-f24bf9cd-99b5-4ff2-b836-5148b31fc362 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.290s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:16:07 user nova-compute[71205]: INFO nova.scheduler.client.report [None req-f24bf9cd-99b5-4ff2-b836-5148b31fc362 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Deleted allocations for instance eb48c285-06f8-4d48-a550-c3fb3c05e93a Apr 24 00:16:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-f24bf9cd-99b5-4ff2-b836-5148b31fc362 tempest-AttachVolumeShelveTestJSON-1947115496 tempest-AttachVolumeShelveTestJSON-1947115496-project-member] Lock "eb48c285-06f8-4d48-a550-c3fb3c05e93a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.280s 
{{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:16:07 user nova-compute[71205]: DEBUG nova.compute.manager [req-609126cb-f080-46fb-a4f6-b48423510032 req-5727f6ec-a540-4043-b5ee-b4a9e8322e8c service nova] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Received event network-vif-plugged-44f2910d-d36a-479f-8fa0-f16f9a406765 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:16:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-609126cb-f080-46fb-a4f6-b48423510032 req-5727f6ec-a540-4043-b5ee-b4a9e8322e8c service nova] Acquiring lock "eb48c285-06f8-4d48-a550-c3fb3c05e93a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:16:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-609126cb-f080-46fb-a4f6-b48423510032 req-5727f6ec-a540-4043-b5ee-b4a9e8322e8c service nova] Lock "eb48c285-06f8-4d48-a550-c3fb3c05e93a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:16:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-609126cb-f080-46fb-a4f6-b48423510032 req-5727f6ec-a540-4043-b5ee-b4a9e8322e8c service nova] Lock "eb48c285-06f8-4d48-a550-c3fb3c05e93a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:16:07 user nova-compute[71205]: DEBUG nova.compute.manager [req-609126cb-f080-46fb-a4f6-b48423510032 req-5727f6ec-a540-4043-b5ee-b4a9e8322e8c service nova] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] No waiting events found dispatching network-vif-plugged-44f2910d-d36a-479f-8fa0-f16f9a406765 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:16:07 user nova-compute[71205]: WARNING nova.compute.manager [req-609126cb-f080-46fb-a4f6-b48423510032 req-5727f6ec-a540-4043-b5ee-b4a9e8322e8c service nova] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Received unexpected event network-vif-plugged-44f2910d-d36a-479f-8fa0-f16f9a406765 for instance with vm_state deleted and task_state None. 
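After the guest is gone, the resource tracker re-reports provider inventory to placement; "Inventory has not changed for provider ... based on inventory data: {...}" prints the exact payload being compared. The snippet below copies that inventory dict verbatim from the log and shows the comparison idea; the helper function is a hypothetical stand-in, not the scheduler report client's real code.

    # Inventory dict copied verbatim from the log line above; the
    # comparison helper is illustrative of the "has it changed?" check
    # made before pushing an inventory update to placement.
    local_inventory = {
        'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12,
                 'step_size': 1, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1,
                      'max_unit': 16023, 'step_size': 1,
                      'allocation_ratio': 1.0},
        'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40,
                    'step_size': 1, 'allocation_ratio': 1.0},
    }


    def inventory_changed(reported, local):
        """Return True if placement needs a PUT for this provider."""
        return reported != local


    if not inventory_changed(local_inventory, dict(local_inventory)):
        print("Inventory has not changed for provider "
              "67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4")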
Apr 24 00:16:08 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:11 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:12 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:16:12 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:16:12 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71205) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 24 00:16:12 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:16:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:16:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:16:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:16:12 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:16:12 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/43f004f3-9b3f-4388-88a8-8eb663ba36a3/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:16:12 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit 
--as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/43f004f3-9b3f-4388-88a8-8eb663ba36a3/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:16:12 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/43f004f3-9b3f-4388-88a8-8eb663ba36a3/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:16:12 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/43f004f3-9b3f-4388-88a8-8eb663ba36a3/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:16:12 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:16:12 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:16:12 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:16:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:16:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk.rescue --force-share --output=json {{(pid=71205) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:16:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk.rescue --force-share --output=json" returned: 0 in 0.128s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:16:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk.rescue --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:16:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk.rescue --force-share --output=json" returned: 0 in 0.136s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:16:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:16:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:16:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:16:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:16:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 
None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:16:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:16:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:16:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:16:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:16:13 user nova-compute[71205]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:16:13 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] VM Stopped (Lifecycle Event) Apr 24 00:16:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:16:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:16:14 user nova-compute[71205]: 
DEBUG nova.compute.manager [None req-0eea8f85-7829-4f75-8f34-8a1a665d3541 None None] [instance: 38c59ffc-494a-4d2b-a199-226a6e7cc683] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:16:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:16:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:16:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:16:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:16:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:16:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:16:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk --force-share --output=json" returned: 0 in 0.127s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} 
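[Editor's note] The repeated "qemu-img info ... --force-share --output=json" entries above are the per-instance disk probes the compute service runs under oslo_concurrency.prlimit, capping address space at 1 GiB and CPU time at 30 s (the --as/--cpu flags in the logged command lines). A minimal sketch of reproducing one such probe with oslo.concurrency is below; the disk path is copied from the log, and the limits are assumptions taken from those flags rather than from any config shown here.

    # Sketch: re-run the logged qemu-img info probe under the same prlimit caps.
    # Assumes oslo.concurrency and qemu-img are available locally.
    import json

    from oslo_concurrency import processutils

    DISK = '/opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948/disk'

    limits = processutils.ProcessLimits(address_space=1024 ** 3,  # --as=1073741824
                                        cpu_time=30)              # --cpu=30

    # execute() wraps the command with "python -m oslo_concurrency.prlimit ... --",
    # which is exactly the invocation recorded in the DEBUG lines above.
    out, _err = processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'info', DISK, '--force-share', '--output=json',
        prlimit=limits)

    info = json.loads(out)
    print(info.get('format'), info.get('virtual-size'))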
Apr 24 00:16:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:16:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk --force-share --output=json" returned: 0 in 0.157s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:16:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:16:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:16:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:16:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:16:15 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:16:15 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
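[Editor's note] The two WARNING lines just above fire when the libvirt driver sees more than one CPU socket inside a single NUMA cell, which disables the `socket` PCI NUMA affinity policy. A hedged sketch of inspecting that topology yourself with libvirt-python follows; the connection URI and the capabilities-XML layout are the standard libvirt ones, not values taken from this log.

    # Sketch (assumes libvirt-python is installed and qemu:///system is reachable):
    # count distinct socket_ids per NUMA cell in the host capabilities XML.
    from collections import defaultdict
    from xml.etree import ElementTree

    import libvirt

    conn = libvirt.open('qemu:///system')
    caps = ElementTree.fromstring(conn.getCapabilities())

    sockets_per_cell = defaultdict(set)
    for cell in caps.findall('./host/topology/cells/cell'):
        for cpu in cell.findall('./cpus/cpu'):
            sockets_per_cell[cell.get('id')].add(cpu.get('socket_id'))

    for cell_id, sockets in sockets_per_cell.items():
        note = 'multiple sockets per NUMA node' if len(sockets) > 1 else 'single socket'
        print(f'NUMA cell {cell_id}: {len(sockets)} socket(s) -- {note}')
    conn.close()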
Apr 24 00:16:15 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Hypervisor/Node resource view: name=user free_ram=8178MB free_disk=26.52322006225586GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:16:15 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:16:15 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:16:15 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance ce19423d-a6ee-4506-9cd1-ec4803abdd86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:16:15 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:16:15 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:16:15 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance ac38bbc2-2229-4497-b501-e9230ec59a32 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:16:15 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance f1a14b79-7792-4962-bbe1-ec11e10e6948 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:16:15 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance 43f004f3-9b3f-4388-88a8-8eb663ba36a3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:16:15 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance cf2c88d5-8347-4166-a037-158f29c32d1a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:16:15 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance 1c91af3a-b837-4ff0-a236-3483ffe5277d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:16:15 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Total usable vcpus: 12, total allocated vcpus: 8 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:16:15 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Final resource view: name=user phys_ram=16023MB used_ram=1536MB phys_disk=40GB used_disk=8GB total_vcpus=12 used_vcpus=8 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:16:15 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:16:15 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:16:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None 
req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:16:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.737s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:16:16 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:16:17 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:16:17 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:16:17 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Starting heal instance info cache {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 24 00:16:17 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Didn't find any instances for network info cache update. 
{{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 24 00:16:17 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:16:17 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:16:17 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:16:17 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:16:21 user nova-compute[71205]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:16:21 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] VM Stopped (Lifecycle Event) Apr 24 00:16:21 user nova-compute[71205]: DEBUG nova.compute.manager [None req-00b3afc1-1b8b-44b7-ae18-85e6fc676256 None None] [instance: eb48c285-06f8-4d48-a550-c3fb3c05e93a] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:16:21 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:16:21 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:21 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:16:21 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:16:21 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:16:21 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-92bec28d-c367-4f39-a2cc-0999762e8dd4 tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Acquiring lock "dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:16:23 user 
nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-92bec28d-c367-4f39-a2cc-0999762e8dd4 tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:16:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-92bec28d-c367-4f39-a2cc-0999762e8dd4 tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Acquiring lock "dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:16:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-92bec28d-c367-4f39-a2cc-0999762e8dd4 tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:16:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-92bec28d-c367-4f39-a2cc-0999762e8dd4 tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:16:23 user nova-compute[71205]: INFO nova.compute.manager [None req-92bec28d-c367-4f39-a2cc-0999762e8dd4 tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Terminating instance Apr 24 00:16:23 user nova-compute[71205]: DEBUG nova.compute.manager [None req-92bec28d-c367-4f39-a2cc-0999762e8dd4 tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Start destroying the instance on the hypervisor. 
{{(pid=71205) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 24 00:16:23 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:23 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:23 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:23 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:23 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:23 user nova-compute[71205]: DEBUG nova.compute.manager [req-ee5a7f01-be71-47e8-8ab3-bf316dab6c50 req-0b6cff13-d461-4f51-83f8-c722b9cdb508 service nova] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Received event network-vif-unplugged-d6f98d8d-f918-4aa5-abb0-a34e782f890a {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:16:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-ee5a7f01-be71-47e8-8ab3-bf316dab6c50 req-0b6cff13-d461-4f51-83f8-c722b9cdb508 service nova] Acquiring lock "dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:16:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-ee5a7f01-be71-47e8-8ab3-bf316dab6c50 req-0b6cff13-d461-4f51-83f8-c722b9cdb508 service nova] Lock "dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:16:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-ee5a7f01-be71-47e8-8ab3-bf316dab6c50 req-0b6cff13-d461-4f51-83f8-c722b9cdb508 service nova] Lock "dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:16:23 user nova-compute[71205]: DEBUG nova.compute.manager [req-ee5a7f01-be71-47e8-8ab3-bf316dab6c50 req-0b6cff13-d461-4f51-83f8-c722b9cdb508 service nova] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] No waiting events found dispatching network-vif-unplugged-d6f98d8d-f918-4aa5-abb0-a34e782f890a {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:16:23 user nova-compute[71205]: DEBUG nova.compute.manager [req-ee5a7f01-be71-47e8-8ab3-bf316dab6c50 req-0b6cff13-d461-4f51-83f8-c722b9cdb508 service nova] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Received event network-vif-unplugged-d6f98d8d-f918-4aa5-abb0-a34e782f890a for instance with task_state deleting. 
{{(pid=71205) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 24 00:16:23 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Instance destroyed successfully. Apr 24 00:16:23 user nova-compute[71205]: DEBUG nova.objects.instance [None req-92bec28d-c367-4f39-a2cc-0999762e8dd4 tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lazy-loading 'resources' on Instance uuid dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:16:23 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-92bec28d-c367-4f39-a2cc-0999762e8dd4 tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:11:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-624022111',display_name='tempest-VolumesAdminNegativeTest-server-624022111',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-624022111',id=7,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLS1AjVipKW5cEt2rl35m09NWv1oXpefnHcFJOdwsnsIp5JPDa2HbS1qzIbePN1In3l/JpDLeRBWNBZrjWWOmQw0RrMjdQbQyn62Y/HmMBLXcgD+X7ygv99DAQ58UKkQKw==',key_name='tempest-keypair-324105185',keypairs=,launch_index=0,launched_at=2023-04-24T00:11:57Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='df0187dbb10d42da941645107df203f6',ramdisk_id='',reservation_id='r-tapv9pec',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesAdminNegativeTest-821594302',owner_user_name='tempest-VolumesAdminNegativeTest-821594302-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-24T00:11:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='640ec20e46a2422a8aabcc152e522e02',uuid=dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d6f98d8d-f918-4aa5-abb0-a34e782f890a", "address": "fa:16:3e:b2:2c:52", "network": {"id": "fc4a840b-9c4f-4c52-8a4e-411d66ac14e1", 
"bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2047577326-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.15", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df0187dbb10d42da941645107df203f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6f98d8d-f9", "ovs_interfaceid": "d6f98d8d-f918-4aa5-abb0-a34e782f890a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 24 00:16:23 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-92bec28d-c367-4f39-a2cc-0999762e8dd4 tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Converting VIF {"id": "d6f98d8d-f918-4aa5-abb0-a34e782f890a", "address": "fa:16:3e:b2:2c:52", "network": {"id": "fc4a840b-9c4f-4c52-8a4e-411d66ac14e1", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2047577326-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.15", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df0187dbb10d42da941645107df203f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6f98d8d-f9", "ovs_interfaceid": "d6f98d8d-f918-4aa5-abb0-a34e782f890a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:16:23 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-92bec28d-c367-4f39-a2cc-0999762e8dd4 tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b2:2c:52,bridge_name='br-int',has_traffic_filtering=True,id=d6f98d8d-f918-4aa5-abb0-a34e782f890a,network=Network(fc4a840b-9c4f-4c52-8a4e-411d66ac14e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd6f98d8d-f9') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:16:23 user nova-compute[71205]: DEBUG os_vif [None req-92bec28d-c367-4f39-a2cc-0999762e8dd4 tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b2:2c:52,bridge_name='br-int',has_traffic_filtering=True,id=d6f98d8d-f918-4aa5-abb0-a34e782f890a,network=Network(fc4a840b-9c4f-4c52-8a4e-411d66ac14e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd6f98d8d-f9') {{(pid=71205) unplug 
/usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 24 00:16:23 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:23 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6f98d8d-f9, bridge=br-int, if_exists=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:16:23 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:23 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:16:23 user nova-compute[71205]: INFO os_vif [None req-92bec28d-c367-4f39-a2cc-0999762e8dd4 tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b2:2c:52,bridge_name='br-int',has_traffic_filtering=True,id=d6f98d8d-f918-4aa5-abb0-a34e782f890a,network=Network(fc4a840b-9c4f-4c52-8a4e-411d66ac14e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd6f98d8d-f9') Apr 24 00:16:23 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-92bec28d-c367-4f39-a2cc-0999762e8dd4 tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Deleting instance files /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d_del Apr 24 00:16:23 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-92bec28d-c367-4f39-a2cc-0999762e8dd4 tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Deletion of /opt/stack/data/nova/instances/dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d_del complete Apr 24 00:16:24 user nova-compute[71205]: INFO nova.compute.manager [None req-92bec28d-c367-4f39-a2cc-0999762e8dd4 tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Took 0.85 seconds to destroy the instance on the hypervisor. Apr 24 00:16:24 user nova-compute[71205]: DEBUG oslo.service.loopingcall [None req-92bec28d-c367-4f39-a2cc-0999762e8dd4 tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71205) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 24 00:16:24 user nova-compute[71205]: DEBUG nova.compute.manager [-] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Deallocating network for instance {{(pid=71205) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 24 00:16:24 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] deallocate_for_instance() {{(pid=71205) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 24 00:16:24 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:16:24 user nova-compute[71205]: DEBUG nova.compute.manager [req-c05d7e07-dae8-478a-852c-6263027b0490 req-2f4c6a96-5512-49c3-9b2c-40d6ab35670a service nova] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Received event network-vif-deleted-d6f98d8d-f918-4aa5-abb0-a34e782f890a {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:16:24 user nova-compute[71205]: INFO nova.compute.manager [req-c05d7e07-dae8-478a-852c-6263027b0490 req-2f4c6a96-5512-49c3-9b2c-40d6ab35670a service nova] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Neutron deleted interface d6f98d8d-f918-4aa5-abb0-a34e782f890a; detaching it from the instance and deleting it from the info cache Apr 24 00:16:24 user nova-compute[71205]: DEBUG nova.network.neutron [req-c05d7e07-dae8-478a-852c-6263027b0490 req-2f4c6a96-5512-49c3-9b2c-40d6ab35670a service nova] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:16:24 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Took 0.77 seconds to deallocate network for instance. Apr 24 00:16:24 user nova-compute[71205]: DEBUG nova.compute.manager [req-c05d7e07-dae8-478a-852c-6263027b0490 req-2f4c6a96-5512-49c3-9b2c-40d6ab35670a service nova] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Detach interface failed, port_id=d6f98d8d-f918-4aa5-abb0-a34e782f890a, reason: Instance dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d could not be found. 
{{(pid=71205) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 24 00:16:24 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-92bec28d-c367-4f39-a2cc-0999762e8dd4 tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:16:24 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-92bec28d-c367-4f39-a2cc-0999762e8dd4 tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:16:25 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-92bec28d-c367-4f39-a2cc-0999762e8dd4 tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:16:25 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-92bec28d-c367-4f39-a2cc-0999762e8dd4 tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:16:25 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-92bec28d-c367-4f39-a2cc-0999762e8dd4 tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.269s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:16:25 user nova-compute[71205]: INFO nova.scheduler.client.report [None req-92bec28d-c367-4f39-a2cc-0999762e8dd4 tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Deleted allocations for instance dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d Apr 24 00:16:25 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-92bec28d-c367-4f39-a2cc-0999762e8dd4 tempest-VolumesAdminNegativeTest-821594302 tempest-VolumesAdminNegativeTest-821594302-project-member] Lock "dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.075s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:16:25 user nova-compute[71205]: DEBUG nova.compute.manager [req-784c58fb-b646-4745-b3b6-4ecceb6654cf req-6b82aca9-ff5d-47c6-9f99-bbee10f53b7d service nova] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Received event 
network-vif-plugged-d6f98d8d-f918-4aa5-abb0-a34e782f890a {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:16:25 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-784c58fb-b646-4745-b3b6-4ecceb6654cf req-6b82aca9-ff5d-47c6-9f99-bbee10f53b7d service nova] Acquiring lock "dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:16:25 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-784c58fb-b646-4745-b3b6-4ecceb6654cf req-6b82aca9-ff5d-47c6-9f99-bbee10f53b7d service nova] Lock "dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:16:25 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-784c58fb-b646-4745-b3b6-4ecceb6654cf req-6b82aca9-ff5d-47c6-9f99-bbee10f53b7d service nova] Lock "dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:16:25 user nova-compute[71205]: DEBUG nova.compute.manager [req-784c58fb-b646-4745-b3b6-4ecceb6654cf req-6b82aca9-ff5d-47c6-9f99-bbee10f53b7d service nova] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] No waiting events found dispatching network-vif-plugged-d6f98d8d-f918-4aa5-abb0-a34e782f890a {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:16:25 user nova-compute[71205]: WARNING nova.compute.manager [req-784c58fb-b646-4745-b3b6-4ecceb6654cf req-6b82aca9-ff5d-47c6-9f99-bbee10f53b7d service nova] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Received unexpected event network-vif-plugged-d6f98d8d-f918-4aa5-abb0-a34e782f890a for instance with vm_state deleted and task_state None. 
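[Editor's note] For reference, the usable capacity implied by the inventory repeatedly reported for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 above follows the usual placement rule, capacity = (total - reserved) * allocation_ratio per resource class; that formula is the standard placement calculation, not something printed in this log itself. A small worked sketch with the exact logged numbers:

    # Worked example using the inventory dict logged above for this provider.
    # Assumed formula (standard placement behaviour): (total - reserved) * allocation_ratio.
    inventory = {
        'VCPU':      {'total': 12,    'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 40,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(f'{rc}: usable capacity {capacity:g}')

    # VCPU -> 48, MEMORY_MB -> 15511, DISK_GB -> 40; consistent with the earlier
    # "Total usable vcpus: 12, total allocated vcpus: 8" view under a 4.0 VCPU ratio.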
Apr 24 00:16:28 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:28 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:33 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:33 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:34 user nova-compute[71205]: DEBUG nova.compute.manager [req-0c0d990b-49fb-464f-8f4e-07d8e7a22e6f req-85f91419-f7cd-4a67-86b2-eba32fd68a0d service nova] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Received event network-changed-389e432f-0336-45fe-b2ff-4ba1ac63e0f3 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:16:34 user nova-compute[71205]: DEBUG nova.compute.manager [req-0c0d990b-49fb-464f-8f4e-07d8e7a22e6f req-85f91419-f7cd-4a67-86b2-eba32fd68a0d service nova] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Refreshing instance network info cache due to event network-changed-389e432f-0336-45fe-b2ff-4ba1ac63e0f3. {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:16:34 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-0c0d990b-49fb-464f-8f4e-07d8e7a22e6f req-85f91419-f7cd-4a67-86b2-eba32fd68a0d service nova] Acquiring lock "refresh_cache-43f004f3-9b3f-4388-88a8-8eb663ba36a3" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:16:34 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-0c0d990b-49fb-464f-8f4e-07d8e7a22e6f req-85f91419-f7cd-4a67-86b2-eba32fd68a0d service nova] Acquired lock "refresh_cache-43f004f3-9b3f-4388-88a8-8eb663ba36a3" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:16:34 user nova-compute[71205]: DEBUG nova.network.neutron [req-0c0d990b-49fb-464f-8f4e-07d8e7a22e6f req-85f91419-f7cd-4a67-86b2-eba32fd68a0d service nova] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Refreshing network info cache for port 389e432f-0336-45fe-b2ff-4ba1ac63e0f3 {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:16:35 user nova-compute[71205]: DEBUG nova.network.neutron [req-0c0d990b-49fb-464f-8f4e-07d8e7a22e6f req-85f91419-f7cd-4a67-86b2-eba32fd68a0d service nova] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Updated VIF entry in instance network info cache for port 389e432f-0336-45fe-b2ff-4ba1ac63e0f3. 
{{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:16:35 user nova-compute[71205]: DEBUG nova.network.neutron [req-0c0d990b-49fb-464f-8f4e-07d8e7a22e6f req-85f91419-f7cd-4a67-86b2-eba32fd68a0d service nova] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Updating instance_info_cache with network_info: [{"id": "389e432f-0336-45fe-b2ff-4ba1ac63e0f3", "address": "fa:16:3e:52:21:72", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.129", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap389e432f-03", "ovs_interfaceid": "389e432f-0336-45fe-b2ff-4ba1ac63e0f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:16:35 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-0c0d990b-49fb-464f-8f4e-07d8e7a22e6f req-85f91419-f7cd-4a67-86b2-eba32fd68a0d service nova] Releasing lock "refresh_cache-43f004f3-9b3f-4388-88a8-8eb663ba36a3" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:16:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-0d46dbbe-67fc-4c29-b661-fff97be797f2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquiring lock "43f004f3-9b3f-4388-88a8-8eb663ba36a3" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:16:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-0d46dbbe-67fc-4c29-b661-fff97be797f2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "43f004f3-9b3f-4388-88a8-8eb663ba36a3" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:16:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-0d46dbbe-67fc-4c29-b661-fff97be797f2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquiring lock "43f004f3-9b3f-4388-88a8-8eb663ba36a3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:16:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-0d46dbbe-67fc-4c29-b661-fff97be797f2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "43f004f3-9b3f-4388-88a8-8eb663ba36a3-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:16:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-0d46dbbe-67fc-4c29-b661-fff97be797f2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "43f004f3-9b3f-4388-88a8-8eb663ba36a3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:16:36 user nova-compute[71205]: INFO nova.compute.manager [None req-0d46dbbe-67fc-4c29-b661-fff97be797f2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Terminating instance Apr 24 00:16:36 user nova-compute[71205]: DEBUG nova.compute.manager [None req-0d46dbbe-67fc-4c29-b661-fff97be797f2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Start destroying the instance on the hypervisor. {{(pid=71205) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 24 00:16:36 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:36 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:36 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:36 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:36 user nova-compute[71205]: DEBUG nova.compute.manager [req-f5cfbbf4-92a7-4a36-83a8-2ae4f56d9ead req-41f23baa-407f-483d-82f2-4c35b37cad5c service nova] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Received event network-vif-unplugged-389e432f-0336-45fe-b2ff-4ba1ac63e0f3 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:16:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-f5cfbbf4-92a7-4a36-83a8-2ae4f56d9ead req-41f23baa-407f-483d-82f2-4c35b37cad5c service nova] Acquiring lock "43f004f3-9b3f-4388-88a8-8eb663ba36a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:16:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-f5cfbbf4-92a7-4a36-83a8-2ae4f56d9ead req-41f23baa-407f-483d-82f2-4c35b37cad5c service nova] Lock "43f004f3-9b3f-4388-88a8-8eb663ba36a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:16:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-f5cfbbf4-92a7-4a36-83a8-2ae4f56d9ead req-41f23baa-407f-483d-82f2-4c35b37cad5c service nova] Lock "43f004f3-9b3f-4388-88a8-8eb663ba36a3-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:16:36 user nova-compute[71205]: DEBUG nova.compute.manager [req-f5cfbbf4-92a7-4a36-83a8-2ae4f56d9ead req-41f23baa-407f-483d-82f2-4c35b37cad5c service nova] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] No waiting events found dispatching network-vif-unplugged-389e432f-0336-45fe-b2ff-4ba1ac63e0f3 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:16:36 user nova-compute[71205]: DEBUG nova.compute.manager [req-f5cfbbf4-92a7-4a36-83a8-2ae4f56d9ead req-41f23baa-407f-483d-82f2-4c35b37cad5c service nova] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Received event network-vif-unplugged-389e432f-0336-45fe-b2ff-4ba1ac63e0f3 for instance with task_state deleting. {{(pid=71205) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 24 00:16:36 user nova-compute[71205]: DEBUG nova.compute.manager [req-f5cfbbf4-92a7-4a36-83a8-2ae4f56d9ead req-41f23baa-407f-483d-82f2-4c35b37cad5c service nova] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Received event network-vif-plugged-389e432f-0336-45fe-b2ff-4ba1ac63e0f3 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:16:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-f5cfbbf4-92a7-4a36-83a8-2ae4f56d9ead req-41f23baa-407f-483d-82f2-4c35b37cad5c service nova] Acquiring lock "43f004f3-9b3f-4388-88a8-8eb663ba36a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:16:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-f5cfbbf4-92a7-4a36-83a8-2ae4f56d9ead req-41f23baa-407f-483d-82f2-4c35b37cad5c service nova] Lock "43f004f3-9b3f-4388-88a8-8eb663ba36a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:16:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-f5cfbbf4-92a7-4a36-83a8-2ae4f56d9ead req-41f23baa-407f-483d-82f2-4c35b37cad5c service nova] Lock "43f004f3-9b3f-4388-88a8-8eb663ba36a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:16:36 user nova-compute[71205]: DEBUG nova.compute.manager [req-f5cfbbf4-92a7-4a36-83a8-2ae4f56d9ead req-41f23baa-407f-483d-82f2-4c35b37cad5c service nova] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] No waiting events found dispatching network-vif-plugged-389e432f-0336-45fe-b2ff-4ba1ac63e0f3 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:16:36 user nova-compute[71205]: WARNING nova.compute.manager [req-f5cfbbf4-92a7-4a36-83a8-2ae4f56d9ead req-41f23baa-407f-483d-82f2-4c35b37cad5c service nova] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Received unexpected event network-vif-plugged-389e432f-0336-45fe-b2ff-4ba1ac63e0f3 for instance with vm_state active and task_state deleting. Apr 24 00:16:37 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Instance destroyed successfully. 
Apr 24 00:16:37 user nova-compute[71205]: DEBUG nova.objects.instance [None req-0d46dbbe-67fc-4c29-b661-fff97be797f2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lazy-loading 'resources' on Instance uuid 43f004f3-9b3f-4388-88a8-8eb663ba36a3 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:16:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-0d46dbbe-67fc-4c29-b661-fff97be797f2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:14:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-482287756',display_name='tempest-AttachVolumeNegativeTest-server-482287756',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-482287756',id=16,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFZwHgbIk6e428TAmz7l4fwQyTPgmH1ghkhtoPbQ0yUpoo/9gV6SJaSCP7G0NDgEplyQD28R6/mr35tsYyIgUkJf2JTlCo+FavLRnp0EzEd9fZ37x1cfd9aCy0pyr8ntuw==',key_name='tempest-keypair-1687889936',keypairs=,launch_index=0,launched_at=2023-04-24T00:14:49Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='97d1e8a757a746329ea363af81a3c6b4',ramdisk_id='',reservation_id='r-jlbi39yi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-272859998',owner_user_name='tempest-AttachVolumeNegativeTest-272859998-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-24T00:14:50Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35edcadbe77c4f4fa8304216e7f61d4a',uuid=43f004f3-9b3f-4388-88a8-8eb663ba36a3,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "389e432f-0336-45fe-b2ff-4ba1ac63e0f3", "address": "fa:16:3e:52:21:72", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.129", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap389e432f-03", "ovs_interfaceid": "389e432f-0336-45fe-b2ff-4ba1ac63e0f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 24 00:16:37 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-0d46dbbe-67fc-4c29-b661-fff97be797f2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Converting VIF {"id": "389e432f-0336-45fe-b2ff-4ba1ac63e0f3", "address": "fa:16:3e:52:21:72", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.129", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap389e432f-03", "ovs_interfaceid": "389e432f-0336-45fe-b2ff-4ba1ac63e0f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:16:37 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-0d46dbbe-67fc-4c29-b661-fff97be797f2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:52:21:72,bridge_name='br-int',has_traffic_filtering=True,id=389e432f-0336-45fe-b2ff-4ba1ac63e0f3,network=Network(1a83eb44-caaf-46ff-8823-be4580052b3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap389e432f-03') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:16:37 user nova-compute[71205]: DEBUG os_vif [None req-0d46dbbe-67fc-4c29-b661-fff97be797f2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:52:21:72,bridge_name='br-int',has_traffic_filtering=True,id=389e432f-0336-45fe-b2ff-4ba1ac63e0f3,network=Network(1a83eb44-caaf-46ff-8823-be4580052b3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap389e432f-03') {{(pid=71205) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 24 00:16:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap389e432f-03, bridge=br-int, if_exists=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:16:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:16:37 user nova-compute[71205]: INFO os_vif [None req-0d46dbbe-67fc-4c29-b661-fff97be797f2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:52:21:72,bridge_name='br-int',has_traffic_filtering=True,id=389e432f-0336-45fe-b2ff-4ba1ac63e0f3,network=Network(1a83eb44-caaf-46ff-8823-be4580052b3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap389e432f-03') Apr 24 00:16:37 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-0d46dbbe-67fc-4c29-b661-fff97be797f2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Deleting instance files /opt/stack/data/nova/instances/43f004f3-9b3f-4388-88a8-8eb663ba36a3_del Apr 24 00:16:37 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-0d46dbbe-67fc-4c29-b661-fff97be797f2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Deletion of /opt/stack/data/nova/instances/43f004f3-9b3f-4388-88a8-8eb663ba36a3_del complete Apr 24 00:16:37 user nova-compute[71205]: INFO nova.compute.manager [None req-0d46dbbe-67fc-4c29-b661-fff97be797f2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Took 1.03 seconds to destroy the instance on the hypervisor. Apr 24 00:16:37 user nova-compute[71205]: DEBUG oslo.service.loopingcall [None req-0d46dbbe-67fc-4c29-b661-fff97be797f2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71205) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 24 00:16:37 user nova-compute[71205]: DEBUG nova.compute.manager [-] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Deallocating network for instance {{(pid=71205) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 24 00:16:37 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] deallocate_for_instance() {{(pid=71205) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 24 00:16:38 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:16:38 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Took 0.97 seconds to deallocate network for instance. Apr 24 00:16:38 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-0d46dbbe-67fc-4c29-b661-fff97be797f2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:16:38 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-0d46dbbe-67fc-4c29-b661-fff97be797f2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:16:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:38 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-0d46dbbe-67fc-4c29-b661-fff97be797f2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:16:38 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-0d46dbbe-67fc-4c29-b661-fff97be797f2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:16:38 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-0d46dbbe-67fc-4c29-b661-fff97be797f2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.244s 
{{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:16:38 user nova-compute[71205]: INFO nova.scheduler.client.report [None req-0d46dbbe-67fc-4c29-b661-fff97be797f2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Deleted allocations for instance 43f004f3-9b3f-4388-88a8-8eb663ba36a3 Apr 24 00:16:38 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-0d46dbbe-67fc-4c29-b661-fff97be797f2 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "43f004f3-9b3f-4388-88a8-8eb663ba36a3" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.430s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:16:38 user nova-compute[71205]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:16:38 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] VM Stopped (Lifecycle Event) Apr 24 00:16:38 user nova-compute[71205]: DEBUG nova.compute.manager [None req-81a2f8aa-1002-4ccb-aa4c-bd8b57ad3fde None None] [instance: dae014b2-b56f-4e0f-8c1e-12b05d9c5b3d] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:16:39 user nova-compute[71205]: DEBUG nova.compute.manager [req-a0699300-482a-4aa5-bbdb-db447e666aff req-efdd6e69-bf2e-4a90-8a0f-41250fbe0406 service nova] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Received event network-vif-deleted-389e432f-0336-45fe-b2ff-4ba1ac63e0f3 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:16:42 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:46 user nova-compute[71205]: DEBUG nova.compute.manager [req-6464f7c0-91de-4b20-bf4f-264d9cf476f3 req-6f05fc81-14ca-4b26-b180-517056e7ccf7 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Received event network-changed-783a3713-64fb-48f3-b3ba-0312249006eb {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:16:46 user nova-compute[71205]: DEBUG nova.compute.manager [req-6464f7c0-91de-4b20-bf4f-264d9cf476f3 req-6f05fc81-14ca-4b26-b180-517056e7ccf7 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Refreshing instance network info cache due to event network-changed-783a3713-64fb-48f3-b3ba-0312249006eb. 
{{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:16:46 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-6464f7c0-91de-4b20-bf4f-264d9cf476f3 req-6f05fc81-14ca-4b26-b180-517056e7ccf7 service nova] Acquiring lock "refresh_cache-cf2c88d5-8347-4166-a037-158f29c32d1a" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:16:46 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-6464f7c0-91de-4b20-bf4f-264d9cf476f3 req-6f05fc81-14ca-4b26-b180-517056e7ccf7 service nova] Acquired lock "refresh_cache-cf2c88d5-8347-4166-a037-158f29c32d1a" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:16:46 user nova-compute[71205]: DEBUG nova.network.neutron [req-6464f7c0-91de-4b20-bf4f-264d9cf476f3 req-6f05fc81-14ca-4b26-b180-517056e7ccf7 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Refreshing network info cache for port 783a3713-64fb-48f3-b3ba-0312249006eb {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:16:47 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:16:47 user nova-compute[71205]: DEBUG nova.network.neutron [req-6464f7c0-91de-4b20-bf4f-264d9cf476f3 req-6f05fc81-14ca-4b26-b180-517056e7ccf7 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Updated VIF entry in instance network info cache for port 783a3713-64fb-48f3-b3ba-0312249006eb. {{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:16:47 user nova-compute[71205]: DEBUG nova.network.neutron [req-6464f7c0-91de-4b20-bf4f-264d9cf476f3 req-6f05fc81-14ca-4b26-b180-517056e7ccf7 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Updating instance_info_cache with network_info: [{"id": "783a3713-64fb-48f3-b3ba-0312249006eb", "address": "fa:16:3e:87:78:10", "network": {"id": "551bb188-8d6d-484a-813f-5f76f0ec0e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1308396726-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.38", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "211ed190ee5c4a0b98b85960339ea437", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap783a3713-64", "ovs_interfaceid": "783a3713-64fb-48f3-b3ba-0312249006eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:16:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-6464f7c0-91de-4b20-bf4f-264d9cf476f3 req-6f05fc81-14ca-4b26-b180-517056e7ccf7 service nova] Releasing lock "refresh_cache-cf2c88d5-8347-4166-a037-158f29c32d1a" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:16:48 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog 
[-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:49 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:52 user nova-compute[71205]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:16:52 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] VM Stopped (Lifecycle Event) Apr 24 00:16:52 user nova-compute[71205]: DEBUG nova.compute.manager [None req-d3830136-26d9-4f63-a65e-ee31e936acaa None None] [instance: 43f004f3-9b3f-4388-88a8-8eb663ba36a3] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:16:52 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:52 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6a5fc73c-fb6e-4cec-9bb0-acf2eda73fe5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Acquiring lock "f1a14b79-7792-4962-bbe1-ec11e10e6948" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:16:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6a5fc73c-fb6e-4cec-9bb0-acf2eda73fe5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "f1a14b79-7792-4962-bbe1-ec11e10e6948" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:16:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6a5fc73c-fb6e-4cec-9bb0-acf2eda73fe5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Acquiring lock "f1a14b79-7792-4962-bbe1-ec11e10e6948-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:16:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6a5fc73c-fb6e-4cec-9bb0-acf2eda73fe5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "f1a14b79-7792-4962-bbe1-ec11e10e6948-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:16:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6a5fc73c-fb6e-4cec-9bb0-acf2eda73fe5 tempest-ServerRescueNegativeTestJSON-487575741 
tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "f1a14b79-7792-4962-bbe1-ec11e10e6948-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:16:56 user nova-compute[71205]: INFO nova.compute.manager [None req-6a5fc73c-fb6e-4cec-9bb0-acf2eda73fe5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Terminating instance Apr 24 00:16:56 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6a5fc73c-fb6e-4cec-9bb0-acf2eda73fe5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Start destroying the instance on the hypervisor. {{(pid=71205) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 24 00:16:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:56 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Instance destroyed successfully. 
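The ovsdbapp transaction logged at 00:16:37 above, DelPortCommand(port=tap389e432f-03, bridge=br-int, if_exists=True), removes the instance's tap interface from the integration bridge over the OVSDB IDL connection that os-vif holds open. The same change can be expressed with the ovs-vsctl CLI; the sketch below is only an illustrative equivalent, not the code path os-vif actually uses:

import subprocess

# --if-exists mirrors if_exists=True in the logged command: deleting a port
# that is already gone is not treated as an error.
subprocess.run(
    ['ovs-vsctl', '--if-exists', 'del-port', 'br-int', 'tap389e432f-03'],
    check=True,
)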
Apr 24 00:16:56 user nova-compute[71205]: DEBUG nova.objects.instance [None req-6a5fc73c-fb6e-4cec-9bb0-acf2eda73fe5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lazy-loading 'resources' on Instance uuid f1a14b79-7792-4962-bbe1-ec11e10e6948 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:16:56 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-6a5fc73c-fb6e-4cec-9bb0-acf2eda73fe5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:12:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-265518045',display_name='tempest-ServerRescueNegativeTestJSON-server-265518045',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-265518045',id=10,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-24T00:14:07Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='5cff0cbf3a5c4a4aadb3399a31adff0d',ramdisk_id='',reservation_id='r-xi2sy34x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerRescueNegativeTestJSON-487575741',owner_user_name='tempest-ServerRescueNegativeTestJSON-487575741-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-24T00:14:07Z,user_data=None,user_id='514ecffec8034d60ae3c00ecd1ef5c8b',uuid=f1a14b79-7792-4962-bbe1-ec11e10e6948,vcpu_model=,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "fbd4aa76-4861-41fe-a0dc-5dee747b2517", "address": "fa:16:3e:33:34:0d", "network": {"id": "ba4ccc60-ffcd-4638-9c93-6d4d59641961", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1591321320-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5cff0cbf3a5c4a4aadb3399a31adff0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": 
true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbd4aa76-48", "ovs_interfaceid": "fbd4aa76-4861-41fe-a0dc-5dee747b2517", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 24 00:16:56 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-6a5fc73c-fb6e-4cec-9bb0-acf2eda73fe5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Converting VIF {"id": "fbd4aa76-4861-41fe-a0dc-5dee747b2517", "address": "fa:16:3e:33:34:0d", "network": {"id": "ba4ccc60-ffcd-4638-9c93-6d4d59641961", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1591321320-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5cff0cbf3a5c4a4aadb3399a31adff0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbd4aa76-48", "ovs_interfaceid": "fbd4aa76-4861-41fe-a0dc-5dee747b2517", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:16:56 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-6a5fc73c-fb6e-4cec-9bb0-acf2eda73fe5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:33:34:0d,bridge_name='br-int',has_traffic_filtering=True,id=fbd4aa76-4861-41fe-a0dc-5dee747b2517,network=Network(ba4ccc60-ffcd-4638-9c93-6d4d59641961),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbd4aa76-48') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:16:56 user nova-compute[71205]: DEBUG os_vif [None req-6a5fc73c-fb6e-4cec-9bb0-acf2eda73fe5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:33:34:0d,bridge_name='br-int',has_traffic_filtering=True,id=fbd4aa76-4861-41fe-a0dc-5dee747b2517,network=Network(ba4ccc60-ffcd-4638-9c93-6d4d59641961),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbd4aa76-48') {{(pid=71205) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 24 00:16:56 user nova-compute[71205]: DEBUG nova.compute.manager [req-2dd766f6-9524-49fb-a446-e9b5c2ff4e3b req-8315e96a-e887-48e9-a5f3-34292d09a468 service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Received event network-vif-unplugged-fbd4aa76-4861-41fe-a0dc-5dee747b2517 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:16:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-2dd766f6-9524-49fb-a446-e9b5c2ff4e3b req-8315e96a-e887-48e9-a5f3-34292d09a468 service nova] Acquiring lock "f1a14b79-7792-4962-bbe1-ec11e10e6948-events" 
by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:16:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-2dd766f6-9524-49fb-a446-e9b5c2ff4e3b req-8315e96a-e887-48e9-a5f3-34292d09a468 service nova] Lock "f1a14b79-7792-4962-bbe1-ec11e10e6948-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:16:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-2dd766f6-9524-49fb-a446-e9b5c2ff4e3b req-8315e96a-e887-48e9-a5f3-34292d09a468 service nova] Lock "f1a14b79-7792-4962-bbe1-ec11e10e6948-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:16:56 user nova-compute[71205]: DEBUG nova.compute.manager [req-2dd766f6-9524-49fb-a446-e9b5c2ff4e3b req-8315e96a-e887-48e9-a5f3-34292d09a468 service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] No waiting events found dispatching network-vif-unplugged-fbd4aa76-4861-41fe-a0dc-5dee747b2517 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:16:56 user nova-compute[71205]: DEBUG nova.compute.manager [req-2dd766f6-9524-49fb-a446-e9b5c2ff4e3b req-8315e96a-e887-48e9-a5f3-34292d09a468 service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Received event network-vif-unplugged-fbd4aa76-4861-41fe-a0dc-5dee747b2517 for instance with task_state deleting. {{(pid=71205) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 24 00:16:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbd4aa76-48, bridge=br-int, if_exists=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:16:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:56 user nova-compute[71205]: INFO os_vif [None req-6a5fc73c-fb6e-4cec-9bb0-acf2eda73fe5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:33:34:0d,bridge_name='br-int',has_traffic_filtering=True,id=fbd4aa76-4861-41fe-a0dc-5dee747b2517,network=Network(ba4ccc60-ffcd-4638-9c93-6d4d59641961),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbd4aa76-48') Apr 24 00:16:56 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-6a5fc73c-fb6e-4cec-9bb0-acf2eda73fe5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Deleting instance files 
/opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948_del Apr 24 00:16:56 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-6a5fc73c-fb6e-4cec-9bb0-acf2eda73fe5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Deletion of /opt/stack/data/nova/instances/f1a14b79-7792-4962-bbe1-ec11e10e6948_del complete Apr 24 00:16:57 user nova-compute[71205]: INFO nova.compute.manager [None req-6a5fc73c-fb6e-4cec-9bb0-acf2eda73fe5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Took 0.68 seconds to destroy the instance on the hypervisor. Apr 24 00:16:57 user nova-compute[71205]: DEBUG oslo.service.loopingcall [None req-6a5fc73c-fb6e-4cec-9bb0-acf2eda73fe5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71205) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 24 00:16:57 user nova-compute[71205]: DEBUG nova.compute.manager [-] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Deallocating network for instance {{(pid=71205) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 24 00:16:57 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] deallocate_for_instance() {{(pid=71205) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 24 00:16:57 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:16:57 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Took 0.95 seconds to deallocate network for instance. 
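The "Inventory has not changed for provider 67b4ed56-..." entries in this log (one at 00:16:38 above, another at 00:16:58 below) report each resource class as total/reserved/allocation_ratio; the capacity placement can actually schedule against is (total - reserved) * allocation_ratio. A small illustrative calculation using the values from those entries:

# Values copied from the inventory data logged for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4.
inventory = {
    'VCPU':      {'total': 12,    'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 40,    'reserved': 0,   'allocation_ratio': 1.0},
}

for rc, inv in inventory.items():
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(rc, capacity)   # VCPU 48.0, MEMORY_MB 15511.0, DISK_GB 40.0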
Apr 24 00:16:57 user nova-compute[71205]: DEBUG nova.compute.manager [req-e3b7528f-0a77-463a-91a2-8d37a8781ddf req-24793757-8606-4ca4-9a43-348ad7cf5e23 service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Received event network-vif-deleted-fbd4aa76-4861-41fe-a0dc-5dee747b2517 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:16:57 user nova-compute[71205]: INFO nova.compute.manager [req-e3b7528f-0a77-463a-91a2-8d37a8781ddf req-24793757-8606-4ca4-9a43-348ad7cf5e23 service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Neutron deleted interface fbd4aa76-4861-41fe-a0dc-5dee747b2517; detaching it from the instance and deleting it from the info cache Apr 24 00:16:57 user nova-compute[71205]: DEBUG nova.network.neutron [req-e3b7528f-0a77-463a-91a2-8d37a8781ddf req-24793757-8606-4ca4-9a43-348ad7cf5e23 service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:16:58 user nova-compute[71205]: DEBUG nova.compute.manager [req-e3b7528f-0a77-463a-91a2-8d37a8781ddf req-24793757-8606-4ca4-9a43-348ad7cf5e23 service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Detach interface failed, port_id=fbd4aa76-4861-41fe-a0dc-5dee747b2517, reason: Instance f1a14b79-7792-4962-bbe1-ec11e10e6948 could not be found. {{(pid=71205) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 24 00:16:58 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6a5fc73c-fb6e-4cec-9bb0-acf2eda73fe5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:16:58 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6a5fc73c-fb6e-4cec-9bb0-acf2eda73fe5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:16:58 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6a5fc73c-fb6e-4cec-9bb0-acf2eda73fe5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:16:58 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6a5fc73c-fb6e-4cec-9bb0-acf2eda73fe5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:16:58 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6a5fc73c-fb6e-4cec-9bb0-acf2eda73fe5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.252s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:16:58 user nova-compute[71205]: INFO nova.scheduler.client.report [None req-6a5fc73c-fb6e-4cec-9bb0-acf2eda73fe5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Deleted allocations for instance f1a14b79-7792-4962-bbe1-ec11e10e6948 Apr 24 00:16:58 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6a5fc73c-fb6e-4cec-9bb0-acf2eda73fe5 tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "f1a14b79-7792-4962-bbe1-ec11e10e6948" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.070s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:16:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:16:58 user nova-compute[71205]: DEBUG nova.compute.manager [req-8f93deec-9253-475b-942d-a0e300656463 req-a40e2293-4c72-4df6-90c2-8068d3bca09f service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Received event network-vif-plugged-fbd4aa76-4861-41fe-a0dc-5dee747b2517 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:16:59 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-8f93deec-9253-475b-942d-a0e300656463 req-a40e2293-4c72-4df6-90c2-8068d3bca09f service nova] Acquiring lock "f1a14b79-7792-4962-bbe1-ec11e10e6948-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:16:59 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-8f93deec-9253-475b-942d-a0e300656463 req-a40e2293-4c72-4df6-90c2-8068d3bca09f service nova] Lock "f1a14b79-7792-4962-bbe1-ec11e10e6948-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.003s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:16:59 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-8f93deec-9253-475b-942d-a0e300656463 req-a40e2293-4c72-4df6-90c2-8068d3bca09f service nova] Lock "f1a14b79-7792-4962-bbe1-ec11e10e6948-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:16:59 user nova-compute[71205]: DEBUG nova.compute.manager [req-8f93deec-9253-475b-942d-a0e300656463 req-a40e2293-4c72-4df6-90c2-8068d3bca09f service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] No waiting events found dispatching network-vif-plugged-fbd4aa76-4861-41fe-a0dc-5dee747b2517 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:16:59 user nova-compute[71205]: WARNING nova.compute.manager 
[req-8f93deec-9253-475b-942d-a0e300656463 req-a40e2293-4c72-4df6-90c2-8068d3bca09f service nova] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Received unexpected event network-vif-plugged-fbd4aa76-4861-41fe-a0dc-5dee747b2517 for instance with vm_state deleted and task_state None. Apr 24 00:17:00 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:01 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:03 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:06 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:07 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:17:07 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Cleaning up deleted instances {{(pid=71205) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 24 00:17:07 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] There are 0 instances to clean {{(pid=71205) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 24 00:17:08 user nova-compute[71205]: DEBUG nova.compute.manager [None req-cf5e3676-2394-4f05-81d5-c6143eb1eb24 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:17:08 user nova-compute[71205]: INFO nova.compute.manager [None req-cf5e3676-2394-4f05-81d5-c6143eb1eb24 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] instance snapshotting Apr 24 00:17:08 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-cf5e3676-2394-4f05-81d5-c6143eb1eb24 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Beginning live snapshot process Apr 24 00:17:08 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-cf5e3676-2394-4f05-81d5-c6143eb1eb24 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d/disk --force-share --output=json -f qcow2 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:17:09 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None 
req-cf5e3676-2394-4f05-81d5-c6143eb1eb24 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d/disk --force-share --output=json -f qcow2" returned: 0 in 0.140s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:17:09 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-cf5e3676-2394-4f05-81d5-c6143eb1eb24 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d/disk --force-share --output=json -f qcow2 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:17:09 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-cf5e3676-2394-4f05-81d5-c6143eb1eb24 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d/disk --force-share --output=json -f qcow2" returned: 0 in 0.139s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:17:09 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-cf5e3676-2394-4f05-81d5-c6143eb1eb24 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:17:09 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-cf5e3676-2394-4f05-81d5-c6143eb1eb24 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.141s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:17:09 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-cf5e3676-2394-4f05-81d5-c6143eb1eb24 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpgt8oz5rb/9b0a82b0f6c64a4dbde7a059053c5176.delta 1073741824 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} 
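The qemu-img info calls above are run through oslo.concurrency's prlimit wrapper (--as=1073741824 --cpu=30) so that probing a malformed image cannot exhaust the host's memory or CPU. A minimal sketch of issuing the same resource-limited call from Python, with the limits and disk path taken from the logged command line (assumes oslo.concurrency is available):

from oslo_concurrency import processutils

# 1 GiB address-space cap and 30 s CPU cap, matching --as=1073741824 --cpu=30.
limits = processutils.ProcessLimits(address_space=1024 ** 3, cpu_time=30)

out, _err = processutils.execute(
    'env', 'LC_ALL=C', 'LANG=C',
    'qemu-img', 'info',
    '/opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d/disk',
    '--force-share', '--output=json', '-f', 'qcow2',
    prlimit=limits)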
Apr 24 00:17:09 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-cf5e3676-2394-4f05-81d5-c6143eb1eb24 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpgt8oz5rb/9b0a82b0f6c64a4dbde7a059053c5176.delta 1073741824" returned: 0 in 0.048s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:17:09 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-cf5e3676-2394-4f05-81d5-c6143eb1eb24 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Quiescing instance not available: QEMU guest agent is not enabled. Apr 24 00:17:10 user nova-compute[71205]: DEBUG nova.virt.libvirt.guest [None req-cf5e3676-2394-4f05-81d5-c6143eb1eb24 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] COPY block job progress, current cursor: 0 final cursor: 43778048 {{(pid=71205) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 24 00:17:10 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:17:10 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Cleaning up deleted instances with incomplete migration {{(pid=71205) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 24 00:17:10 user nova-compute[71205]: DEBUG nova.virt.libvirt.guest [None req-cf5e3676-2394-4f05-81d5-c6143eb1eb24 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] COPY block job progress, current cursor: 43778048 final cursor: 43778048 {{(pid=71205) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 24 00:17:10 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-cf5e3676-2394-4f05-81d5-c6143eb1eb24 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Skipping quiescing instance: QEMU guest agent is not enabled. 
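The live-snapshot sequence here first creates a throwaway qcow2 overlay backed by the cached base image, lets libvirt's COPY block job fill it (the "COPY block job progress" entries above), and then flattens it with the qemu-img convert call logged just below before the image upload reported further down. An illustrative reproduction of the two qemu-img steps; the delta and output paths are hypothetical stand-ins for the tmpdir used in the log:

import subprocess

base = '/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8'
delta = '/tmp/snapshot.delta'    # hypothetical; the log uses instances/snapshots/tmpgt8oz5rb/...
flat = '/tmp/snapshot.qcow2'     # hypothetical flattened image handed to the upload step

# Step 1: thin qcow2 overlay on top of the raw base image, 1073741824 bytes of
# virtual size, mirroring the logged "qemu-img create -f qcow2 -o backing_file=...".
subprocess.run(
    ['qemu-img', 'create', '-f', 'qcow2',
     '-o', f'backing_file={base},backing_fmt=raw',
     delta, '1073741824'],
    check=True)

# Step 2: once the block job completes, flatten the overlay into a standalone
# qcow2, mirroring the logged "qemu-img convert -t none -O qcow2 -f qcow2 ...".
subprocess.run(
    ['qemu-img', 'convert', '-t', 'none', '-O', 'qcow2', '-f', 'qcow2',
     delta, flat],
    check=True)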
Apr 24 00:17:10 user nova-compute[71205]: DEBUG nova.privsep.utils [None req-cf5e3676-2394-4f05-81d5-c6143eb1eb24 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71205) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 24 00:17:10 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-cf5e3676-2394-4f05-81d5-c6143eb1eb24 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpgt8oz5rb/9b0a82b0f6c64a4dbde7a059053c5176.delta /opt/stack/data/nova/instances/snapshots/tmpgt8oz5rb/9b0a82b0f6c64a4dbde7a059053c5176 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:17:10 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-cf5e3676-2394-4f05-81d5-c6143eb1eb24 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpgt8oz5rb/9b0a82b0f6c64a4dbde7a059053c5176.delta /opt/stack/data/nova/instances/snapshots/tmpgt8oz5rb/9b0a82b0f6c64a4dbde7a059053c5176" returned: 0 in 0.251s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:17:10 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-cf5e3676-2394-4f05-81d5-c6143eb1eb24 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Snapshot extracted, beginning image upload Apr 24 00:17:11 user nova-compute[71205]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:17:11 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] VM Stopped (Lifecycle Event) Apr 24 00:17:11 user nova-compute[71205]: DEBUG nova.compute.manager [None req-962713a7-e123-4cb2-b2df-1ad69f86c347 None None] [instance: f1a14b79-7792-4962-bbe1-ec11e10e6948] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:17:11 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:13 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:17:13 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:17:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:17:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:17:13 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:17:13 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:17:13 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-cf5e3676-2394-4f05-81d5-c6143eb1eb24 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Snapshot image upload complete Apr 24 00:17:13 user nova-compute[71205]: INFO nova.compute.manager [None req-cf5e3676-2394-4f05-81d5-c6143eb1eb24 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Took 4.68 seconds to snapshot the instance on the hypervisor. Apr 24 00:17:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:17:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:17:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:17:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json" returned: 
0 in 0.139s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:17:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:17:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:17:13 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:17:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:17:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:17:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:17:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:17:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None 
req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:17:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:17:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:17:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:17:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:17:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:17:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:17:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:17:14 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:17:15 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:17:15 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:17:15 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Hypervisor/Node resource view: name=user free_ram=8535MB free_disk=26.524532318115234GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", 
"product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:17:15 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:17:15 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:17:15 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance ce19423d-a6ee-4506-9cd1-ec4803abdd86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:17:15 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:17:15 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance ac38bbc2-2229-4497-b501-e9230ec59a32 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:17:15 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance cf2c88d5-8347-4166-a037-158f29c32d1a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:17:15 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance 1c91af3a-b837-4ff0-a236-3483ffe5277d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:17:15 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Total usable vcpus: 12, total allocated vcpus: 5 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:17:15 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Final resource view: name=user phys_ram=16023MB used_ram=1152MB phys_disk=40GB used_disk=5GB total_vcpus=12 used_vcpus=5 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:17:15 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:17:15 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:17:15 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:17:15 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.341s 
{{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:17:16 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:17:16 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:17:16 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Starting heal instance info cache {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 24 00:17:16 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Rebuilding the list of instances to heal {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 24 00:17:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "refresh_cache-ce19423d-a6ee-4506-9cd1-ec4803abdd86" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:17:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquired lock "refresh_cache-ce19423d-a6ee-4506-9cd1-ec4803abdd86" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:17:16 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Forcefully refreshing network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 24 00:17:16 user nova-compute[71205]: DEBUG nova.objects.instance [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lazy-loading 'info_cache' on Instance uuid ce19423d-a6ee-4506-9cd1-ec4803abdd86 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:17:16 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:17 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:17 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Updating instance_info_cache with network_info: [{"id": "f6c98734-17ee-42d0-9372-fd76526b0b27", "address": "fa:16:3e:72:c7:21", "network": {"id": "2b9edd97-2af0-4b09-993c-95cc2b427fd8", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1649565193-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": 
"ce75f63fc0904eceb03e8319bddba4d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6c98734-17", "ovs_interfaceid": "f6c98734-17ee-42d0-9372-fd76526b0b27", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:17:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Releasing lock "refresh_cache-ce19423d-a6ee-4506-9cd1-ec4803abdd86" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:17:17 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Updated the network info_cache for instance {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 24 00:17:17 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:17:17 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:17:17 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:17:17 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:17:17 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71205) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 24 00:17:18 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:17:18 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:17:18 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:17:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:20 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:21 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:26 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:26 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:28 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:31 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquiring lock "a2e58fac-ff2d-47e5-866d-de1f2b741cb3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:17:31 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "a2e58fac-ff2d-47e5-866d-de1f2b741cb3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:17:31 user nova-compute[71205]: DEBUG nova.compute.manager [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Starting instance... 
{{(pid=71205) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 24 00:17:31 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:17:31 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:17:31 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71205) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 24 00:17:31 user nova-compute[71205]: INFO nova.compute.claims [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Claim successful on node user Apr 24 00:17:31 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Refreshing inventories for resource provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 24 00:17:31 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Updating ProviderTree inventory for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 24 00:17:31 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Updating inventory in ProviderTree for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) update_inventory 
/opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 24 00:17:31 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Refreshing aggregate associations for resource provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4, aggregates: None {{(pid=71205) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 24 00:17:31 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Refreshing trait associations for resource provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4, traits: HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_DEVICE_TAGGING {{(pid=71205) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 24 00:17:31 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:31 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:17:31 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:17:31 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None 
req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.425s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:17:31 user nova-compute[71205]: DEBUG nova.compute.manager [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Start building networks asynchronously for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 24 00:17:31 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:31 user nova-compute[71205]: DEBUG nova.compute.manager [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Allocating IP information in the background. {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 24 00:17:31 user nova-compute[71205]: DEBUG nova.network.neutron [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] allocate_for_instance() {{(pid=71205) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 24 00:17:32 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 24 00:17:32 user nova-compute[71205]: DEBUG nova.compute.manager [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Start building block device mappings for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 24 00:17:32 user nova-compute[71205]: DEBUG nova.compute.manager [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Start spawning the instance on the hypervisor. 
{{(pid=71205) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 24 00:17:32 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Creating instance directory {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 24 00:17:32 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Creating image(s) Apr 24 00:17:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquiring lock "/opt/stack/data/nova/instances/a2e58fac-ff2d-47e5-866d-de1f2b741cb3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:17:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "/opt/stack/data/nova/instances/a2e58fac-ff2d-47e5-866d-de1f2b741cb3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:17:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "/opt/stack/data/nova/instances/a2e58fac-ff2d-47e5-866d-de1f2b741cb3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:17:32 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:17:32 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.141s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:17:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None 
req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquiring lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:17:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:17:32 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:17:32 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.134s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:17:32 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/a2e58fac-ff2d-47e5-866d-de1f2b741cb3/disk 1073741824 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:17:32 user nova-compute[71205]: DEBUG nova.policy [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '35edcadbe77c4f4fa8304216e7f61d4a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '97d1e8a757a746329ea363af81a3c6b4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71205) authorize /opt/stack/nova/nova/policy.py:203}} Apr 24 00:17:32 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 
tempest-AttachVolumeNegativeTest-272859998-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/a2e58fac-ff2d-47e5-866d-de1f2b741cb3/disk 1073741824" returned: 0 in 0.062s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:17:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.201s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:17:32 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:17:32 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.145s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:17:32 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Checking if we can resize image /opt/stack/data/nova/instances/a2e58fac-ff2d-47e5-866d-de1f2b741cb3/disk. 
size=1073741824 {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 24 00:17:32 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a2e58fac-ff2d-47e5-866d-de1f2b741cb3/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:17:32 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a2e58fac-ff2d-47e5-866d-de1f2b741cb3/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:17:32 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Cannot resize image /opt/stack/data/nova/instances/a2e58fac-ff2d-47e5-866d-de1f2b741cb3/disk to a smaller size. {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 24 00:17:32 user nova-compute[71205]: DEBUG nova.objects.instance [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lazy-loading 'migration_context' on Instance uuid a2e58fac-ff2d-47e5-866d-de1f2b741cb3 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:17:32 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Created local disks {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 24 00:17:32 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Ensure instance console log exists: /opt/stack/data/nova/instances/a2e58fac-ff2d-47e5-866d-de1f2b741cb3/console.log {{(pid=71205) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 24 00:17:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:17:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:17:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:17:33 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:33 user nova-compute[71205]: DEBUG nova.network.neutron [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Successfully created port: bbf311b8-ca79-433b-9979-dd2ee6102146 {{(pid=71205) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 24 00:17:35 user nova-compute[71205]: DEBUG nova.network.neutron [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Successfully updated port: bbf311b8-ca79-433b-9979-dd2ee6102146 {{(pid=71205) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 24 00:17:35 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquiring lock "refresh_cache-a2e58fac-ff2d-47e5-866d-de1f2b741cb3" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:17:35 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquired lock "refresh_cache-a2e58fac-ff2d-47e5-866d-de1f2b741cb3" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:17:35 user nova-compute[71205]: DEBUG nova.network.neutron [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Building network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 24 00:17:35 user nova-compute[71205]: DEBUG nova.compute.manager [req-934b6c81-b90b-4912-ae9b-0e1d73ead335 req-bd79c87d-3a15-4333-ad75-079f717a2a3d service nova] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Received event network-changed-bbf311b8-ca79-433b-9979-dd2ee6102146 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:17:35 user nova-compute[71205]: DEBUG nova.compute.manager [req-934b6c81-b90b-4912-ae9b-0e1d73ead335 req-bd79c87d-3a15-4333-ad75-079f717a2a3d service nova] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Refreshing instance network info cache due to event network-changed-bbf311b8-ca79-433b-9979-dd2ee6102146. 
{{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:17:35 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-934b6c81-b90b-4912-ae9b-0e1d73ead335 req-bd79c87d-3a15-4333-ad75-079f717a2a3d service nova] Acquiring lock "refresh_cache-a2e58fac-ff2d-47e5-866d-de1f2b741cb3" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:17:35 user nova-compute[71205]: DEBUG nova.network.neutron [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Instance cache missing network info. {{(pid=71205) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 24 00:17:35 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG nova.network.neutron [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Updating instance_info_cache with network_info: [{"id": "bbf311b8-ca79-433b-9979-dd2ee6102146", "address": "fa:16:3e:90:73:83", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbbf311b8-ca", "ovs_interfaceid": "bbf311b8-ca79-433b-9979-dd2ee6102146", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Releasing lock "refresh_cache-a2e58fac-ff2d-47e5-866d-de1f2b741cb3" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG nova.compute.manager [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Instance network_info: |[{"id": "bbf311b8-ca79-433b-9979-dd2ee6102146", "address": "fa:16:3e:90:73:83", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbbf311b8-ca", "ovs_interfaceid": "bbf311b8-ca79-433b-9979-dd2ee6102146", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-934b6c81-b90b-4912-ae9b-0e1d73ead335 req-bd79c87d-3a15-4333-ad75-079f717a2a3d service nova] Acquired lock "refresh_cache-a2e58fac-ff2d-47e5-866d-de1f2b741cb3" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG nova.network.neutron [req-934b6c81-b90b-4912-ae9b-0e1d73ead335 req-bd79c87d-3a15-4333-ad75-079f717a2a3d service nova] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Refreshing network info cache for port bbf311b8-ca79-433b-9979-dd2ee6102146 {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Start _get_guest_xml network_info=[{"id": "bbf311b8-ca79-433b-9979-dd2ee6102146", "address": "fa:16:3e:90:73:83", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbbf311b8-ca", "ovs_interfaceid": "bbf311b8-ca79-433b-9979-dd2ee6102146", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'boot_index': 0, 
'encryption_options': None, 'image_id': 'fcf09ead-c5af-40cc-b5cf-92626e181ef9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 24 00:17:36 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:17:36 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:17:36 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71205) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-24T00:08:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=), allow threads: True {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Flavor limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Image limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Flavor pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Image 
pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Got 1 possible topologies {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:17:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-313901605',display_name='tempest-AttachVolumeNegativeTest-server-313901605',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-313901605',id=19,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL9ANefNai0/rsSQaIoE6MwpNZsPCDJ1hAlE0Sqe0cS0k9aiWxLvZOvWrAlWMxkgj6Ru1wbGkR0RjHJCL/oloMLQjvQ7osVNQYPYNqwz8q0ZbFbyL90CnzPAEEVrQST+9A==',key_name='tempest-keypair-669922603',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='97d1e8a757a746329ea363af81a3c6b4',ramdisk_id='',reservation_id='r-4yh2fdof',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-272859998',owner_user_name='tempest-AttachVolumeNegativeTest-272859998-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:17:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35edcadbe77c4f4fa8304216e7f61d4a',uuid=a2e58fac-ff2d-47e5-866d-de1f2b741cb3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bbf311b8-ca79-433b-9979-dd2ee6102146", "address": "fa:16:3e:90:73:83", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbbf311b8-ca", "ovs_interfaceid": "bbf311b8-ca79-433b-9979-dd2ee6102146", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71205) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Converting VIF {"id": "bbf311b8-ca79-433b-9979-dd2ee6102146", "address": "fa:16:3e:90:73:83", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbbf311b8-ca", "ovs_interfaceid": "bbf311b8-ca79-433b-9979-dd2ee6102146", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:90:73:83,bridge_name='br-int',has_traffic_filtering=True,id=bbf311b8-ca79-433b-9979-dd2ee6102146,network=Network(1a83eb44-caaf-46ff-8823-be4580052b3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbbf311b8-ca') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG nova.objects.instance [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lazy-loading 'pci_devices' on Instance uuid a2e58fac-ff2d-47e5-866d-de1f2b741cb3 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] End _get_guest_xml xml= Apr 24 00:17:36 user nova-compute[71205]: a2e58fac-ff2d-47e5-866d-de1f2b741cb3 Apr 24 00:17:36 user nova-compute[71205]: instance-00000013 Apr 24 00:17:36 user nova-compute[71205]: 131072 Apr 24 00:17:36 user nova-compute[71205]: 1 Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: tempest-AttachVolumeNegativeTest-server-313901605 Apr 24 00:17:36 user nova-compute[71205]: 2023-04-24 00:17:36 Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: 128 Apr 24 00:17:36 user nova-compute[71205]: 1 Apr 24 00:17:36 user nova-compute[71205]: 0 Apr 24 00:17:36 user nova-compute[71205]: 0 Apr 24 00:17:36 user nova-compute[71205]: 1 Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: tempest-AttachVolumeNegativeTest-272859998-project-member Apr 24 00:17:36 user nova-compute[71205]: tempest-AttachVolumeNegativeTest-272859998 Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: OpenStack Foundation Apr 24 00:17:36 user nova-compute[71205]: OpenStack Nova Apr 24 00:17:36 user nova-compute[71205]: 0.0.0 Apr 24 00:17:36 user 
nova-compute[71205]: a2e58fac-ff2d-47e5-866d-de1f2b741cb3 Apr 24 00:17:36 user nova-compute[71205]: a2e58fac-ff2d-47e5-866d-de1f2b741cb3 Apr 24 00:17:36 user nova-compute[71205]: Virtual Machine Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: hvm Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Nehalem Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: /dev/urandom Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: Apr 24 00:17:36 user nova-compute[71205]: {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:17:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-313901605',display_name='tempest-AttachVolumeNegativeTest-server-313901605',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-313901605',id=19,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL9ANefNai0/rsSQaIoE6MwpNZsPCDJ1hAlE0Sqe0cS0k9aiWxLvZOvWrAlWMxkgj6Ru1wbGkR0RjHJCL/oloMLQjvQ7osVNQYPYNqwz8q0ZbFbyL90CnzPAEEVrQST+9A==',key_name='tempest-keypair-669922603',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='97d1e8a757a746329ea363af81a3c6b4',ramdisk_id='',reservation_id='r-4yh2fdof',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-272859998',owner_user_name='tempest-AttachVolumeNegativeTest-272859998-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:17:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35edcadbe77c4f4fa8304216e7f61d4a',uuid=a2e58fac-ff2d-47e5-866d-de1f2b741cb3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bbf311b8-ca79-433b-9979-dd2ee6102146", "address": "fa:16:3e:90:73:83", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbbf311b8-ca", "ovs_interfaceid": "bbf311b8-ca79-433b-9979-dd2ee6102146", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Converting VIF {"id": "bbf311b8-ca79-433b-9979-dd2ee6102146", "address": "fa:16:3e:90:73:83", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": 
false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbbf311b8-ca", "ovs_interfaceid": "bbf311b8-ca79-433b-9979-dd2ee6102146", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:90:73:83,bridge_name='br-int',has_traffic_filtering=True,id=bbf311b8-ca79-433b-9979-dd2ee6102146,network=Network(1a83eb44-caaf-46ff-8823-be4580052b3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbbf311b8-ca') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG os_vif [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:90:73:83,bridge_name='br-int',has_traffic_filtering=True,id=bbf311b8-ca79-433b-9979-dd2ee6102146,network=Network(1a83eb44-caaf-46ff-8823-be4580052b3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbbf311b8-ca') {{(pid=71205) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbbf311b8-ca, may_exist=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbbf311b8-ca, col_values=(('external_ids', {'iface-id': 'bbf311b8-ca79-433b-9979-dd2ee6102146', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:90:73:83', 'vm-uuid': 'a2e58fac-ff2d-47e5-866d-de1f2b741cb3'}),)) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:36 user nova-compute[71205]: INFO os_vif [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:90:73:83,bridge_name='br-int',has_traffic_filtering=True,id=bbf311b8-ca79-433b-9979-dd2ee6102146,network=Network(1a83eb44-caaf-46ff-8823-be4580052b3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbbf311b8-ca') Apr 24 00:17:36 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] No BDM found with device name vda, not building metadata. {{(pid=71205) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] No VIF found with MAC fa:16:3e:90:73:83, not building metadata {{(pid=71205) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG nova.network.neutron [req-934b6c81-b90b-4912-ae9b-0e1d73ead335 req-bd79c87d-3a15-4333-ad75-079f717a2a3d service nova] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Updated VIF entry in instance network info cache for port bbf311b8-ca79-433b-9979-dd2ee6102146. 
{{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG nova.network.neutron [req-934b6c81-b90b-4912-ae9b-0e1d73ead335 req-bd79c87d-3a15-4333-ad75-079f717a2a3d service nova] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Updating instance_info_cache with network_info: [{"id": "bbf311b8-ca79-433b-9979-dd2ee6102146", "address": "fa:16:3e:90:73:83", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbbf311b8-ca", "ovs_interfaceid": "bbf311b8-ca79-433b-9979-dd2ee6102146", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:17:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-934b6c81-b90b-4912-ae9b-0e1d73ead335 req-bd79c87d-3a15-4333-ad75-079f717a2a3d service nova] Releasing lock "refresh_cache-a2e58fac-ff2d-47e5-866d-de1f2b741cb3" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:17:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:37 user nova-compute[71205]: DEBUG nova.compute.manager [req-ef873aa5-fff9-4b93-aa22-c6b424a7dd81 req-8159c682-cd75-44c8-9c50-6159bbdedffe service nova] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Received event network-vif-plugged-bbf311b8-ca79-433b-9979-dd2ee6102146 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:17:37 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-ef873aa5-fff9-4b93-aa22-c6b424a7dd81 req-8159c682-cd75-44c8-9c50-6159bbdedffe service nova] Acquiring lock "a2e58fac-ff2d-47e5-866d-de1f2b741cb3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:17:37 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-ef873aa5-fff9-4b93-aa22-c6b424a7dd81 req-8159c682-cd75-44c8-9c50-6159bbdedffe service nova] Lock "a2e58fac-ff2d-47e5-866d-de1f2b741cb3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:17:37 user 
nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-ef873aa5-fff9-4b93-aa22-c6b424a7dd81 req-8159c682-cd75-44c8-9c50-6159bbdedffe service nova] Lock "a2e58fac-ff2d-47e5-866d-de1f2b741cb3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:17:37 user nova-compute[71205]: DEBUG nova.compute.manager [req-ef873aa5-fff9-4b93-aa22-c6b424a7dd81 req-8159c682-cd75-44c8-9c50-6159bbdedffe service nova] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] No waiting events found dispatching network-vif-plugged-bbf311b8-ca79-433b-9979-dd2ee6102146 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:17:37 user nova-compute[71205]: WARNING nova.compute.manager [req-ef873aa5-fff9-4b93-aa22-c6b424a7dd81 req-8159c682-cd75-44c8-9c50-6159bbdedffe service nova] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Received unexpected event network-vif-plugged-bbf311b8-ca79-433b-9979-dd2ee6102146 for instance with vm_state building and task_state spawning. Apr 24 00:17:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:39 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Resumed> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:17:39 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] VM Resumed (Lifecycle Event) Apr 24 00:17:39 user nova-compute[71205]: DEBUG nova.compute.manager [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Instance event wait completed in 0 seconds for {{(pid=71205) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 24 00:17:39 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Guest created on hypervisor {{(pid=71205) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 24 00:17:39 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Instance spawned successfully. 
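The whole trail above (base-image lock, the qemu-img calls, Neutron port creation, guest XML generation, os-vif plugging, lifecycle events) carries the same request ID, req-d7904e52-2962-4d04-8b78-12b41418ec86, even though the records run together on long lines. A minimal Python sketch for pulling one request's records back out of such a capture; the file name nova-compute.log and the helper entries_for_request are illustrative assumptions, not anything referenced by the log itself:

```python
# Minimal sketch (Python 3.7+): split a blob of these journald-style
# nova-compute records on their "Apr 24 HH:MM:SS host nova-compute[pid]:"
# prefixes and keep only the entries mentioning one request ID.
import re

PREFIX = re.compile(
    r'(?=[A-Z][a-z]{2} \d{2} \d{2}:\d{2}:\d{2} \S+ nova-compute\[\d+\]: )'
)

def entries_for_request(blob: str, req_id: str):
    """Yield every record in `blob` that mentions `req_id`."""
    for record in PREFIX.split(blob):          # zero-width split before each prefix
        if req_id in record:
            yield record.strip()

if __name__ == "__main__":
    with open("nova-compute.log") as fh:       # hypothetical file name
        text = fh.read()
    for rec in entries_for_request(text, "req-d7904e52-2962-4d04-8b78-12b41418ec86"):
        print(rec[:120])                       # short preview of each record
```

The same filter applied to a request-id of the form req-934b6c81-... isolates the external-event side of the exchange instead of the build side.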
Apr 24 00:17:39 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 24 00:17:39 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:17:39 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:17:39 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Found default for hw_cdrom_bus of ide {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:17:39 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Found default for hw_disk_bus of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:17:39 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Found default for hw_input_bus of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:17:39 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Found default for hw_pointer_model of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:17:39 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Found default for hw_video_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:17:39 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 
a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Found default for hw_vif_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:17:39 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:17:39 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Started> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:17:39 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] VM Started (Lifecycle Event) Apr 24 00:17:39 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:17:39 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:17:39 user nova-compute[71205]: INFO nova.compute.manager [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Took 7.67 seconds to spawn the instance on the hypervisor. Apr 24 00:17:39 user nova-compute[71205]: DEBUG nova.compute.manager [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:17:39 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:17:39 user nova-compute[71205]: INFO nova.compute.manager [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Took 8.51 seconds to build instance. 
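The build ends with per-instance timings ("Took 7.67 seconds to spawn the instance on the hypervisor", "Took 8.51 seconds to build instance"). A small sketch, under the same assumed file layout as above, for aggregating those figures across a longer capture:

```python
# Minimal sketch (an assumption, not tooling referenced by the log): collect the
# "Took N.NN seconds to ..." figures and report count and mean per phase.
import re
import statistics

TOOK = re.compile(
    r'Took (?P<secs>\d+\.\d+) seconds to '
    r'(?P<what>spawn the instance on the hypervisor|build instance)'
)

def build_time_stats(lines):
    """Return {phase: (count, mean seconds)} for the 'Took ... seconds' messages."""
    samples = {}
    for line in lines:
        for m in TOOK.finditer(line):
            samples.setdefault(m.group('what'), []).append(float(m.group('secs')))
    return {what: (len(vals), statistics.mean(vals)) for what, vals in samples.items()}

if __name__ == "__main__":
    with open("nova-compute.log") as fh:       # hypothetical file name
        print(build_time_stats(fh))
```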
Apr 24 00:17:39 user nova-compute[71205]: DEBUG nova.compute.manager [req-0aeb867e-97e6-4f79-88cd-45c06ab8d800 req-05ce2afa-7cd1-44d3-8bb4-30d7a145383b service nova] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Received event network-vif-plugged-bbf311b8-ca79-433b-9979-dd2ee6102146 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:17:39 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-0aeb867e-97e6-4f79-88cd-45c06ab8d800 req-05ce2afa-7cd1-44d3-8bb4-30d7a145383b service nova] Acquiring lock "a2e58fac-ff2d-47e5-866d-de1f2b741cb3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:17:39 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-0aeb867e-97e6-4f79-88cd-45c06ab8d800 req-05ce2afa-7cd1-44d3-8bb4-30d7a145383b service nova] Lock "a2e58fac-ff2d-47e5-866d-de1f2b741cb3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:17:39 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-0aeb867e-97e6-4f79-88cd-45c06ab8d800 req-05ce2afa-7cd1-44d3-8bb4-30d7a145383b service nova] Lock "a2e58fac-ff2d-47e5-866d-de1f2b741cb3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:17:39 user nova-compute[71205]: DEBUG nova.compute.manager [req-0aeb867e-97e6-4f79-88cd-45c06ab8d800 req-05ce2afa-7cd1-44d3-8bb4-30d7a145383b service nova] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] No waiting events found dispatching network-vif-plugged-bbf311b8-ca79-433b-9979-dd2ee6102146 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:17:39 user nova-compute[71205]: WARNING nova.compute.manager [req-0aeb867e-97e6-4f79-88cd-45c06ab8d800 req-05ce2afa-7cd1-44d3-8bb4-30d7a145383b service nova] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Received unexpected event network-vif-plugged-bbf311b8-ca79-433b-9979-dd2ee6102146 for instance with vm_state active and task_state None. 
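The oslo.concurrency records consistently report how long each named lock was waited on and held (the a2e58fac-...-events lock above: waited 0.001s, held 0.001s; the per-instance build lock released just below: held 8.619s). A hedged sketch for ranking those figures to spot contention; the regex and the slowest_locks helper are assumptions based only on the message format visible here:

```python
# Minimal sketch: pull wait/hold times out of lines such as
#   Lock "compute_resources" acquired by "..." :: waited 0.001s
#   Lock "compute_resources" "released" by "..." :: held 0.501s
# and list the largest samples.
import re

LOCK = re.compile(
    r'Lock "(?P<name>[^"]+)" (?:acquired by "[^"]+" :: waited (?P<waited>[\d.]+)s'
    r'|"released" by "[^"]+" :: held (?P<held>[\d.]+)s)'
)

def slowest_locks(lines, top=5):
    """Return the `top` largest (seconds, kind, lock name) samples seen in `lines`."""
    samples = []
    for line in lines:
        for m in LOCK.finditer(line):
            if m.group('waited') is not None:
                samples.append((float(m.group('waited')), 'waited', m.group('name')))
            else:
                samples.append((float(m.group('held')), 'held', m.group('name')))
    return sorted(samples, reverse=True)[:top]

if __name__ == "__main__":
    with open("nova-compute.log") as fh:       # hypothetical file name
        for secs, kind, name in slowest_locks(fh):
            print(f"{secs:8.3f}s {kind:6} {name}")
```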
Apr 24 00:17:39 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-d7904e52-2962-4d04-8b78-12b41418ec86 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "a2e58fac-ff2d-47e5-866d-de1f2b741cb3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 8.619s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:17:40 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:41 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:43 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:43 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:46 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Acquiring lock "1728a086-3a73-4157-8c2f-3606820448a9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:17:46 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Lock "1728a086-3a73-4157-8c2f-3606820448a9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:17:46 user nova-compute[71205]: DEBUG nova.compute.manager [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Starting instance... 
{{(pid=71205) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 24 00:17:46 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:17:46 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:17:46 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71205) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 24 00:17:46 user nova-compute[71205]: INFO nova.compute.claims [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Claim successful on node user Apr 24 00:17:46 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:46 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:17:46 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:17:46 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.501s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:17:46 user nova-compute[71205]: DEBUG nova.compute.manager [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc 
tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Start building networks asynchronously for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 24 00:17:46 user nova-compute[71205]: DEBUG nova.compute.manager [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Allocating IP information in the background. {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 24 00:17:46 user nova-compute[71205]: DEBUG nova.network.neutron [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] allocate_for_instance() {{(pid=71205) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 24 00:17:46 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 24 00:17:46 user nova-compute[71205]: DEBUG nova.compute.manager [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Start building block device mappings for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 24 00:17:46 user nova-compute[71205]: DEBUG nova.policy [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4192d379850d40c5b684d8835548acd5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9011aa88afdb40f4be4c2e9846fa72dd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71205) authorize /opt/stack/nova/nova/policy.py:203}} Apr 24 00:17:46 user nova-compute[71205]: DEBUG nova.compute.manager [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Start spawning the instance on the hypervisor. 
{{(pid=71205) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 24 00:17:46 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Creating instance directory {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 24 00:17:46 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Creating image(s) Apr 24 00:17:46 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Acquiring lock "/opt/stack/data/nova/instances/1728a086-3a73-4157-8c2f-3606820448a9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:17:46 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Lock "/opt/stack/data/nova/instances/1728a086-3a73-4157-8c2f-3606820448a9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:17:46 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Lock "/opt/stack/data/nova/instances/1728a086-3a73-4157-8c2f-3606820448a9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:17:46 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.146s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None 
req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Acquiring lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.144s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/1728a086-3a73-4157-8c2f-3606820448a9/disk 1073741824 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/1728a086-3a73-4157-8c2f-3606820448a9/disk 1073741824" returned: 0 in 0.056s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" "released" by 
"nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.207s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-4bb971ca-f830-4780-9d69-555283b7387b tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Acquiring lock "ac38bbc2-2229-4497-b501-e9230ec59a32" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-4bb971ca-f830-4780-9d69-555283b7387b tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "ac38bbc2-2229-4497-b501-e9230ec59a32" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.003s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-4bb971ca-f830-4780-9d69-555283b7387b tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Acquiring lock "ac38bbc2-2229-4497-b501-e9230ec59a32-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-4bb971ca-f830-4780-9d69-555283b7387b tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "ac38bbc2-2229-4497-b501-e9230ec59a32-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-4bb971ca-f830-4780-9d69-555283b7387b tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "ac38bbc2-2229-4497-b501-e9230ec59a32-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:17:47 user nova-compute[71205]: INFO nova.compute.manager [None req-4bb971ca-f830-4780-9d69-555283b7387b tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Terminating instance Apr 24 00:17:47 user nova-compute[71205]: DEBUG nova.compute.manager [None 
req-4bb971ca-f830-4780-9d69-555283b7387b tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Start destroying the instance on the hypervisor. {{(pid=71205) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.159s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Checking if we can resize image /opt/stack/data/nova/instances/1728a086-3a73-4157-8c2f-3606820448a9/disk. size=1073741824 {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1728a086-3a73-4157-8c2f-3606820448a9/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1728a086-3a73-4157-8c2f-3606820448a9/disk --force-share --output=json" returned: 0 in 0.166s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Cannot resize image /opt/stack/data/nova/instances/1728a086-3a73-4157-8c2f-3606820448a9/disk to a smaller size. 
{{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG nova.objects.instance [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Lazy-loading 'migration_context' on Instance uuid 1728a086-3a73-4157-8c2f-3606820448a9 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Created local disks {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Ensure instance console log exists: /opt/stack/data/nova/instances/1728a086-3a73-4157-8c2f-3606820448a9/console.log {{(pid=71205) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG nova.network.neutron [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Successfully created port: 1d88c88b-4db7-4078-b68a-941a4a4e7920 {{(pid=71205) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG nova.compute.manager [req-fe440427-21b0-4a6c-a719-eba26602f40a req-d7ad1241-d088-46cc-9585-68624eba8515 service nova] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Received event network-vif-unplugged-b716ea04-c5e7-43fc-9f20-5ecb011d6385 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-fe440427-21b0-4a6c-a719-eba26602f40a req-d7ad1241-d088-46cc-9585-68624eba8515 
service nova] Acquiring lock "ac38bbc2-2229-4497-b501-e9230ec59a32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-fe440427-21b0-4a6c-a719-eba26602f40a req-d7ad1241-d088-46cc-9585-68624eba8515 service nova] Lock "ac38bbc2-2229-4497-b501-e9230ec59a32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-fe440427-21b0-4a6c-a719-eba26602f40a req-d7ad1241-d088-46cc-9585-68624eba8515 service nova] Lock "ac38bbc2-2229-4497-b501-e9230ec59a32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG nova.compute.manager [req-fe440427-21b0-4a6c-a719-eba26602f40a req-d7ad1241-d088-46cc-9585-68624eba8515 service nova] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] No waiting events found dispatching network-vif-unplugged-b716ea04-c5e7-43fc-9f20-5ecb011d6385 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG nova.compute.manager [req-fe440427-21b0-4a6c-a719-eba26602f40a req-d7ad1241-d088-46cc-9585-68624eba8515 service nova] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Received event network-vif-unplugged-b716ea04-c5e7-43fc-9f20-5ecb011d6385 for instance with task_state deleting. {{(pid=71205) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 24 00:17:47 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Instance destroyed successfully. 
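Note: the image provisioning for 1728a086-3a73-4157-8c2f-3606820448a9 above boils down to two qemu-img invocations run through oslo.concurrency's processutils with an address-space/CPU prlimit wrapper (the "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- ..." prefix). A stand-alone sketch of the equivalent shell-outs, with the paths and sizes copied from the log (this is not nova's imagebackend code):

    from oslo_concurrency import processutils

    BASE = '/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8'
    DISK = '/opt/stack/data/nova/instances/1728a086-3a73-4157-8c2f-3606820448a9/disk'

    # Matches the "--as=1073741824 --cpu=30" wrapper: cap the child's address
    # space at 1 GiB and its CPU time at 30 seconds.
    limits = processutils.ProcessLimits(address_space=1073741824, cpu_time=30)

    # Probe the cached base image; --force-share allows this while other
    # processes hold the image open.
    out, _err = processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C', 'qemu-img', 'info', BASE,
        '--force-share', '--output=json', prlimit=limits)

    # Create the instance disk as a 1 GiB qcow2 overlay backed by the raw base.
    processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C', 'qemu-img', 'create', '-f', 'qcow2',
        '-o', 'backing_file=%s,backing_fmt=raw' % BASE, DISK, '1073741824')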
Apr 24 00:17:47 user nova-compute[71205]: DEBUG nova.objects.instance [None req-4bb971ca-f830-4780-9d69-555283b7387b tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lazy-loading 'resources' on Instance uuid ac38bbc2-2229-4497-b501-e9230ec59a32 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-4bb971ca-f830-4780-9d69-555283b7387b tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:12:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-891210301',display_name='tempest-ServerRescueNegativeTestJSON-server-891210301',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-891210301',id=9,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-24T00:12:19Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='5cff0cbf3a5c4a4aadb3399a31adff0d',ramdisk_id='',reservation_id='r-uhn0foxp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerRescueNegativeTestJSON-487575741',owner_user_name='tempest-ServerRescueNegativeTestJSON-487575741-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-24T00:12:20Z,user_data=None,user_id='514ecffec8034d60ae3c00ecd1ef5c8b',uuid=ac38bbc2-2229-4497-b501-e9230ec59a32,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b716ea04-c5e7-43fc-9f20-5ecb011d6385", "address": "fa:16:3e:02:81:d9", "network": {"id": "ba4ccc60-ffcd-4638-9c93-6d4d59641961", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1591321320-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5cff0cbf3a5c4a4aadb3399a31adff0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, 
"connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb716ea04-c5", "ovs_interfaceid": "b716ea04-c5e7-43fc-9f20-5ecb011d6385", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-4bb971ca-f830-4780-9d69-555283b7387b tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Converting VIF {"id": "b716ea04-c5e7-43fc-9f20-5ecb011d6385", "address": "fa:16:3e:02:81:d9", "network": {"id": "ba4ccc60-ffcd-4638-9c93-6d4d59641961", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1591321320-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5cff0cbf3a5c4a4aadb3399a31adff0d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb716ea04-c5", "ovs_interfaceid": "b716ea04-c5e7-43fc-9f20-5ecb011d6385", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-4bb971ca-f830-4780-9d69-555283b7387b tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:81:d9,bridge_name='br-int',has_traffic_filtering=True,id=b716ea04-c5e7-43fc-9f20-5ecb011d6385,network=Network(ba4ccc60-ffcd-4638-9c93-6d4d59641961),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb716ea04-c5') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG os_vif [None req-4bb971ca-f830-4780-9d69-555283b7387b tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:81:d9,bridge_name='br-int',has_traffic_filtering=True,id=b716ea04-c5e7-43fc-9f20-5ecb011d6385,network=Network(ba4ccc60-ffcd-4638-9c93-6d4d59641961),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb716ea04-c5') {{(pid=71205) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb716ea04-c5, bridge=br-int, if_exists=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 
{{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:47 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:17:47 user nova-compute[71205]: INFO os_vif [None req-4bb971ca-f830-4780-9d69-555283b7387b tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:81:d9,bridge_name='br-int',has_traffic_filtering=True,id=b716ea04-c5e7-43fc-9f20-5ecb011d6385,network=Network(ba4ccc60-ffcd-4638-9c93-6d4d59641961),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb716ea04-c5') Apr 24 00:17:47 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-4bb971ca-f830-4780-9d69-555283b7387b tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Deleting instance files /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32_del Apr 24 00:17:47 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-4bb971ca-f830-4780-9d69-555283b7387b tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Deletion of /opt/stack/data/nova/instances/ac38bbc2-2229-4497-b501-e9230ec59a32_del complete Apr 24 00:17:48 user nova-compute[71205]: INFO nova.compute.manager [None req-4bb971ca-f830-4780-9d69-555283b7387b tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Took 0.73 seconds to destroy the instance on the hypervisor. Apr 24 00:17:48 user nova-compute[71205]: DEBUG oslo.service.loopingcall [None req-4bb971ca-f830-4780-9d69-555283b7387b tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71205) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.compute.manager [-] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Deallocating network for instance {{(pid=71205) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] deallocate_for_instance() {{(pid=71205) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.network.neutron [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Successfully updated port: 1d88c88b-4db7-4078-b68a-941a4a4e7920 {{(pid=71205) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Acquiring lock "refresh_cache-1728a086-3a73-4157-8c2f-3606820448a9" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Acquired lock "refresh_cache-1728a086-3a73-4157-8c2f-3606820448a9" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.network.neutron [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Building network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.compute.manager [req-1a9058fd-f93a-4aa8-bffc-502c72f76968 req-08d743de-47f3-4459-8303-9e467138b1f0 service nova] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Received event network-changed-1d88c88b-4db7-4078-b68a-941a4a4e7920 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.compute.manager [req-1a9058fd-f93a-4aa8-bffc-502c72f76968 req-08d743de-47f3-4459-8303-9e467138b1f0 service nova] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Refreshing instance network info cache due to event network-changed-1d88c88b-4db7-4078-b68a-941a4a4e7920. 
{{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-1a9058fd-f93a-4aa8-bffc-502c72f76968 req-08d743de-47f3-4459-8303-9e467138b1f0 service nova] Acquiring lock "refresh_cache-1728a086-3a73-4157-8c2f-3606820448a9" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.network.neutron [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Instance cache missing network info. {{(pid=71205) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._sync_power_states {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:17:48 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Took 0.60 seconds to deallocate network for instance. Apr 24 00:17:48 user nova-compute[71205]: WARNING nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] While synchronizing instance power states, found 7 instances in the database and 5 instances on the hypervisor. 
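Note: the _sync_power_states warning above fires when the periodic task counts more instances in the database than the hypervisor currently reports; here the two freshly built guests are not yet visible as running domains. A rough, illustrative-only sketch of that check, with hypothetical helper names (not nova's actual code):

    def sync_power_states(db_instances, hypervisor_uuids, query_power_state, sync_one):
        """Compare DB records against hypervisor domains, then reconcile each."""
        if len(db_instances) != len(hypervisor_uuids):
            print('While synchronizing instance power states, found %d instances '
                  'in the database and %d instances on the hypervisor.'
                  % (len(db_instances), len(hypervisor_uuids)))
        for inst in db_instances:
            # The real task first takes a per-uuid lock (the "Acquiring lock
            # <uuid> by ...query_driver_power_state_and_sync" lines below) so it
            # cannot race an in-flight build or delete of the same instance.
            sync_one(inst, query_power_state(inst))

    if __name__ == '__main__':
        sync_power_states(
            db_instances=['uuid-%d' % i for i in range(7)],
            hypervisor_uuids=['uuid-%d' % i for i in range(5)],
            query_power_state=lambda inst: 'running',
            sync_one=lambda inst, state: None)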
Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Triggering sync for uuid ce19423d-a6ee-4506-9cd1-ec4803abdd86 {{(pid=71205) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Triggering sync for uuid c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b {{(pid=71205) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Triggering sync for uuid ac38bbc2-2229-4497-b501-e9230ec59a32 {{(pid=71205) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Triggering sync for uuid cf2c88d5-8347-4166-a037-158f29c32d1a {{(pid=71205) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Triggering sync for uuid 1c91af3a-b837-4ff0-a236-3483ffe5277d {{(pid=71205) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Triggering sync for uuid a2e58fac-ff2d-47e5-866d-de1f2b741cb3 {{(pid=71205) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Triggering sync for uuid 1728a086-3a73-4157-8c2f-3606820448a9 {{(pid=71205) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "ce19423d-a6ee-4506-9cd1-ec4803abdd86" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "ce19423d-a6ee-4506-9cd1-ec4803abdd86" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None 
req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "ac38bbc2-2229-4497-b501-e9230ec59a32" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "cf2c88d5-8347-4166-a037-158f29c32d1a" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "cf2c88d5-8347-4166-a037-158f29c32d1a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "1c91af3a-b837-4ff0-a236-3483ffe5277d" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "1c91af3a-b837-4ff0-a236-3483ffe5277d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "a2e58fac-ff2d-47e5-866d-de1f2b741cb3" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "a2e58fac-ff2d-47e5-866d-de1f2b741cb3" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "1728a086-3a73-4157-8c2f-3606820448a9" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "ce19423d-a6ee-4506-9cd1-ec4803abdd86" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.058s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None 
req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "1c91af3a-b837-4ff0-a236-3483ffe5277d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.062s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.078s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "cf2c88d5-8347-4166-a037-158f29c32d1a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.081s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "a2e58fac-ff2d-47e5-866d-de1f2b741cb3" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.085s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-4bb971ca-f830-4780-9d69-555283b7387b tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-4bb971ca-f830-4780-9d69-555283b7387b tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.network.neutron [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Updating instance_info_cache with network_info: [{"id": "1d88c88b-4db7-4078-b68a-941a4a4e7920", "address": "fa:16:3e:14:05:1e", "network": {"id": "13deae10-040e-47ec-a868-0cd1b10e09e1", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-566937497-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9011aa88afdb40f4be4c2e9846fa72dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d88c88b-4d", "ovs_interfaceid": "1d88c88b-4db7-4078-b68a-941a4a4e7920", 
"qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Releasing lock "refresh_cache-1728a086-3a73-4157-8c2f-3606820448a9" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.compute.manager [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Instance network_info: |[{"id": "1d88c88b-4db7-4078-b68a-941a4a4e7920", "address": "fa:16:3e:14:05:1e", "network": {"id": "13deae10-040e-47ec-a868-0cd1b10e09e1", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-566937497-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9011aa88afdb40f4be4c2e9846fa72dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d88c88b-4d", "ovs_interfaceid": "1d88c88b-4db7-4078-b68a-941a4a4e7920", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-1a9058fd-f93a-4aa8-bffc-502c72f76968 req-08d743de-47f3-4459-8303-9e467138b1f0 service nova] Acquired lock "refresh_cache-1728a086-3a73-4157-8c2f-3606820448a9" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.network.neutron [req-1a9058fd-f93a-4aa8-bffc-502c72f76968 req-08d743de-47f3-4459-8303-9e467138b1f0 service nova] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Refreshing network info cache for port 1d88c88b-4db7-4078-b68a-941a4a4e7920 {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Start _get_guest_xml network_info=[{"id": "1d88c88b-4db7-4078-b68a-941a4a4e7920", "address": "fa:16:3e:14:05:1e", "network": {"id": "13deae10-040e-47ec-a868-0cd1b10e09e1", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-566937497-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": 
"10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9011aa88afdb40f4be4c2e9846fa72dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d88c88b-4d", "ovs_interfaceid": "1d88c88b-4db7-4078-b68a-941a4a4e7920", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': 'fcf09ead-c5af-40cc-b5cf-92626e181ef9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 24 00:17:48 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:17:48 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71205) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-24T00:08:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=), allow threads: True {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Flavor limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Image limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Flavor pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Image pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71205) 
_get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Got 1 possible topologies {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:17:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SnapshotDataIntegrityTests-server-1538855312',display_name='tempest-SnapshotDataIntegrityTests-server-1538855312',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-snapshotdataintegritytests-server-1538855312',id=20,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKeZP0p0lwKvXDXvuW52XwDq3F/jdLDEwtvJVwiOp4OTfyWXeIMaaEAMfG/SE3qJxELt/agLszkPj+lhygv4s0kc0XNWg8aaRmcCvDGrSOIMb1gh5CpK3xvghy/XTnLEEA==',key_name='tempest-SnapshotDataIntegrityTests-1799058279',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9011aa88afdb40f4be4c2e9846fa72dd',ramdisk_id='',reservation_id='r-fz3qyeh2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-SnapshotDataIntegrityTests-901788601',owner_user_name='tempest-SnapshotDataIntegrityTests-901788601-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:17:47Z,user_data=None,user_id='4192d379850d40c5b684d8835548acd5',uuid=1728a086-3a73-4157-8c2f-3606820448a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1d88c88b-4db7-4078-b68a-941a4a4e7920", "address": "fa:16:3e:14:05:1e", "network": {"id": "13deae10-040e-47ec-a868-0cd1b10e09e1", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-566937497-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9011aa88afdb40f4be4c2e9846fa72dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d88c88b-4d", "ovs_interfaceid": "1d88c88b-4db7-4078-b68a-941a4a4e7920", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71205) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Converting VIF {"id": "1d88c88b-4db7-4078-b68a-941a4a4e7920", "address": "fa:16:3e:14:05:1e", "network": {"id": "13deae10-040e-47ec-a868-0cd1b10e09e1", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-566937497-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9011aa88afdb40f4be4c2e9846fa72dd", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d88c88b-4d", "ovs_interfaceid": "1d88c88b-4db7-4078-b68a-941a4a4e7920", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:05:1e,bridge_name='br-int',has_traffic_filtering=True,id=1d88c88b-4db7-4078-b68a-941a4a4e7920,network=Network(13deae10-040e-47ec-a868-0cd1b10e09e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d88c88b-4d') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.objects.instance [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Lazy-loading 'pci_devices' on Instance uuid 1728a086-3a73-4157-8c2f-3606820448a9 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] End _get_guest_xml xml= Apr 24 00:17:48 user nova-compute[71205]: 1728a086-3a73-4157-8c2f-3606820448a9 Apr 24 00:17:48 user nova-compute[71205]: instance-00000014 Apr 24 00:17:48 user nova-compute[71205]: 131072 Apr 24 00:17:48 user nova-compute[71205]: 1 Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: tempest-SnapshotDataIntegrityTests-server-1538855312 Apr 24 00:17:48 user nova-compute[71205]: 2023-04-24 00:17:48 Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: 128 Apr 24 00:17:48 user nova-compute[71205]: 1 Apr 24 00:17:48 user nova-compute[71205]: 0 Apr 24 00:17:48 user nova-compute[71205]: 0 Apr 24 00:17:48 user nova-compute[71205]: 1 Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: tempest-SnapshotDataIntegrityTests-901788601-project-member Apr 24 00:17:48 user nova-compute[71205]: tempest-SnapshotDataIntegrityTests-901788601 Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: OpenStack Foundation Apr 24 00:17:48 user nova-compute[71205]: OpenStack Nova Apr 24 00:17:48 user nova-compute[71205]: 0.0.0 Apr 24 00:17:48 user nova-compute[71205]: 1728a086-3a73-4157-8c2f-3606820448a9 Apr 24 00:17:48 user 
nova-compute[71205]: 1728a086-3a73-4157-8c2f-3606820448a9 Apr 24 00:17:48 user nova-compute[71205]: Virtual Machine Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: hvm Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Nehalem Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: /dev/urandom Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: Apr 24 00:17:48 user nova-compute[71205]: {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:17:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SnapshotDataIntegrityTests-server-1538855312',display_name='tempest-SnapshotDataIntegrityTests-server-1538855312',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-snapshotdataintegritytests-server-1538855312',id=20,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKeZP0p0lwKvXDXvuW52XwDq3F/jdLDEwtvJVwiOp4OTfyWXeIMaaEAMfG/SE3qJxELt/agLszkPj+lhygv4s0kc0XNWg8aaRmcCvDGrSOIMb1gh5CpK3xvghy/XTnLEEA==',key_name='tempest-SnapshotDataIntegrityTests-1799058279',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9011aa88afdb40f4be4c2e9846fa72dd',ramdisk_id='',reservation_id='r-fz3qyeh2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-SnapshotDataIntegrityTests-901788601',owner_user_name='tempest-SnapshotDataIntegrityTests-901788601-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:17:47Z,user_data=None,user_id='4192d379850d40c5b684d8835548acd5',uuid=1728a086-3a73-4157-8c2f-3606820448a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1d88c88b-4db7-4078-b68a-941a4a4e7920", "address": "fa:16:3e:14:05:1e", "network": {"id": "13deae10-040e-47ec-a868-0cd1b10e09e1", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-566937497-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9011aa88afdb40f4be4c2e9846fa72dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d88c88b-4d", "ovs_interfaceid": "1d88c88b-4db7-4078-b68a-941a4a4e7920", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Converting VIF {"id": "1d88c88b-4db7-4078-b68a-941a4a4e7920", "address": "fa:16:3e:14:05:1e", "network": {"id": "13deae10-040e-47ec-a868-0cd1b10e09e1", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-566937497-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9011aa88afdb40f4be4c2e9846fa72dd", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d88c88b-4d", "ovs_interfaceid": "1d88c88b-4db7-4078-b68a-941a4a4e7920", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:05:1e,bridge_name='br-int',has_traffic_filtering=True,id=1d88c88b-4db7-4078-b68a-941a4a4e7920,network=Network(13deae10-040e-47ec-a868-0cd1b10e09e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d88c88b-4d') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG os_vif [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:05:1e,bridge_name='br-int',has_traffic_filtering=True,id=1d88c88b-4db7-4078-b68a-941a4a4e7920,network=Network(13deae10-040e-47ec-a868-0cd1b10e09e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d88c88b-4d') {{(pid=71205) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1d88c88b-4d, may_exist=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1d88c88b-4d, col_values=(('external_ids', {'iface-id': '1d88c88b-4db7-4078-b68a-941a4a4e7920', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:14:05:1e', 'vm-uuid': '1728a086-3a73-4157-8c2f-3606820448a9'}),)) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:17:48 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:49 user nova-compute[71205]: INFO os_vif [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:05:1e,bridge_name='br-int',has_traffic_filtering=True,id=1d88c88b-4db7-4078-b68a-941a4a4e7920,network=Network(13deae10-040e-47ec-a868-0cd1b10e09e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d88c88b-4d') Apr 24 00:17:49 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] No BDM found with device name vda, not building metadata. {{(pid=71205) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 24 00:17:49 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] No VIF found with MAC fa:16:3e:14:05:1e, not building metadata {{(pid=71205) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 24 00:17:49 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-4bb971ca-f830-4780-9d69-555283b7387b tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:17:49 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-4bb971ca-f830-4780-9d69-555283b7387b tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:17:49 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-4bb971ca-f830-4780-9d69-555283b7387b tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.357s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:17:49 user nova-compute[71205]: INFO nova.scheduler.client.report [None req-4bb971ca-f830-4780-9d69-555283b7387b tempest-ServerRescueNegativeTestJSON-487575741 
tempest-ServerRescueNegativeTestJSON-487575741-project-member] Deleted allocations for instance ac38bbc2-2229-4497-b501-e9230ec59a32 Apr 24 00:17:49 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-4bb971ca-f830-4780-9d69-555283b7387b tempest-ServerRescueNegativeTestJSON-487575741 tempest-ServerRescueNegativeTestJSON-487575741-project-member] Lock "ac38bbc2-2229-4497-b501-e9230ec59a32" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.951s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:17:49 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "ac38bbc2-2229-4497-b501-e9230ec59a32" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.586s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:17:49 user nova-compute[71205]: INFO nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] During sync_power_state the instance has a pending task (deleting). Skip. Apr 24 00:17:49 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "ac38bbc2-2229-4497-b501-e9230ec59a32" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:17:49 user nova-compute[71205]: DEBUG nova.network.neutron [req-1a9058fd-f93a-4aa8-bffc-502c72f76968 req-08d743de-47f3-4459-8303-9e467138b1f0 service nova] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Updated VIF entry in instance network info cache for port 1d88c88b-4db7-4078-b68a-941a4a4e7920. 
{{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:17:49 user nova-compute[71205]: DEBUG nova.network.neutron [req-1a9058fd-f93a-4aa8-bffc-502c72f76968 req-08d743de-47f3-4459-8303-9e467138b1f0 service nova] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Updating instance_info_cache with network_info: [{"id": "1d88c88b-4db7-4078-b68a-941a4a4e7920", "address": "fa:16:3e:14:05:1e", "network": {"id": "13deae10-040e-47ec-a868-0cd1b10e09e1", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-566937497-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9011aa88afdb40f4be4c2e9846fa72dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d88c88b-4d", "ovs_interfaceid": "1d88c88b-4db7-4078-b68a-941a4a4e7920", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:17:49 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-1a9058fd-f93a-4aa8-bffc-502c72f76968 req-08d743de-47f3-4459-8303-9e467138b1f0 service nova] Releasing lock "refresh_cache-1728a086-3a73-4157-8c2f-3606820448a9" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:17:49 user nova-compute[71205]: DEBUG nova.compute.manager [req-c5776224-fd21-43b0-a938-b441cbe8afa6 req-ca14d39c-5a23-4bee-9171-c8c9213d7c65 service nova] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Received event network-vif-plugged-b716ea04-c5e7-43fc-9f20-5ecb011d6385 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:17:49 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-c5776224-fd21-43b0-a938-b441cbe8afa6 req-ca14d39c-5a23-4bee-9171-c8c9213d7c65 service nova] Acquiring lock "ac38bbc2-2229-4497-b501-e9230ec59a32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:17:49 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-c5776224-fd21-43b0-a938-b441cbe8afa6 req-ca14d39c-5a23-4bee-9171-c8c9213d7c65 service nova] Lock "ac38bbc2-2229-4497-b501-e9230ec59a32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:17:49 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-c5776224-fd21-43b0-a938-b441cbe8afa6 req-ca14d39c-5a23-4bee-9171-c8c9213d7c65 service nova] Lock "ac38bbc2-2229-4497-b501-e9230ec59a32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:17:49 user nova-compute[71205]: DEBUG nova.compute.manager [req-c5776224-fd21-43b0-a938-b441cbe8afa6 req-ca14d39c-5a23-4bee-9171-c8c9213d7c65 service 
nova] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] No waiting events found dispatching network-vif-plugged-b716ea04-c5e7-43fc-9f20-5ecb011d6385 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:17:49 user nova-compute[71205]: WARNING nova.compute.manager [req-c5776224-fd21-43b0-a938-b441cbe8afa6 req-ca14d39c-5a23-4bee-9171-c8c9213d7c65 service nova] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Received unexpected event network-vif-plugged-b716ea04-c5e7-43fc-9f20-5ecb011d6385 for instance with vm_state deleted and task_state None. Apr 24 00:17:49 user nova-compute[71205]: DEBUG nova.compute.manager [req-c5776224-fd21-43b0-a938-b441cbe8afa6 req-ca14d39c-5a23-4bee-9171-c8c9213d7c65 service nova] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Received event network-vif-deleted-b716ea04-c5e7-43fc-9f20-5ecb011d6385 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:17:50 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:50 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:50 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:50 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:50 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:50 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:50 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:50 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:51 user nova-compute[71205]: DEBUG nova.compute.manager [req-796a3cc7-2dea-4e7d-b07e-8dddaceccaf8 req-cb780c5b-8e0d-44f2-8676-db94a213d8e3 service nova] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Received event network-vif-plugged-1d88c88b-4db7-4078-b68a-941a4a4e7920 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:17:51 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-796a3cc7-2dea-4e7d-b07e-8dddaceccaf8 req-cb780c5b-8e0d-44f2-8676-db94a213d8e3 service nova] Acquiring lock "1728a086-3a73-4157-8c2f-3606820448a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:17:51 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-796a3cc7-2dea-4e7d-b07e-8dddaceccaf8 req-cb780c5b-8e0d-44f2-8676-db94a213d8e3 service nova] Lock "1728a086-3a73-4157-8c2f-3606820448a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s 
{{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:17:51 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-796a3cc7-2dea-4e7d-b07e-8dddaceccaf8 req-cb780c5b-8e0d-44f2-8676-db94a213d8e3 service nova] Lock "1728a086-3a73-4157-8c2f-3606820448a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:17:51 user nova-compute[71205]: DEBUG nova.compute.manager [req-796a3cc7-2dea-4e7d-b07e-8dddaceccaf8 req-cb780c5b-8e0d-44f2-8676-db94a213d8e3 service nova] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] No waiting events found dispatching network-vif-plugged-1d88c88b-4db7-4078-b68a-941a4a4e7920 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:17:51 user nova-compute[71205]: WARNING nova.compute.manager [req-796a3cc7-2dea-4e7d-b07e-8dddaceccaf8 req-cb780c5b-8e0d-44f2-8676-db94a213d8e3 service nova] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Received unexpected event network-vif-plugged-1d88c88b-4db7-4078-b68a-941a4a4e7920 for instance with vm_state building and task_state spawning. Apr 24 00:17:51 user nova-compute[71205]: DEBUG nova.compute.manager [req-796a3cc7-2dea-4e7d-b07e-8dddaceccaf8 req-cb780c5b-8e0d-44f2-8676-db94a213d8e3 service nova] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Received event network-vif-plugged-1d88c88b-4db7-4078-b68a-941a4a4e7920 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:17:51 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-796a3cc7-2dea-4e7d-b07e-8dddaceccaf8 req-cb780c5b-8e0d-44f2-8676-db94a213d8e3 service nova] Acquiring lock "1728a086-3a73-4157-8c2f-3606820448a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:17:51 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-796a3cc7-2dea-4e7d-b07e-8dddaceccaf8 req-cb780c5b-8e0d-44f2-8676-db94a213d8e3 service nova] Lock "1728a086-3a73-4157-8c2f-3606820448a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:17:51 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-796a3cc7-2dea-4e7d-b07e-8dddaceccaf8 req-cb780c5b-8e0d-44f2-8676-db94a213d8e3 service nova] Lock "1728a086-3a73-4157-8c2f-3606820448a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:17:51 user nova-compute[71205]: DEBUG nova.compute.manager [req-796a3cc7-2dea-4e7d-b07e-8dddaceccaf8 req-cb780c5b-8e0d-44f2-8676-db94a213d8e3 service nova] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] No waiting events found dispatching network-vif-plugged-1d88c88b-4db7-4078-b68a-941a4a4e7920 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:17:51 user nova-compute[71205]: WARNING nova.compute.manager [req-796a3cc7-2dea-4e7d-b07e-8dddaceccaf8 req-cb780c5b-8e0d-44f2-8676-db94a213d8e3 service nova] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Received unexpected event 
network-vif-plugged-1d88c88b-4db7-4078-b68a-941a4a4e7920 for instance with vm_state building and task_state spawning. Apr 24 00:17:52 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Resumed> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:17:52 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] VM Resumed (Lifecycle Event) Apr 24 00:17:52 user nova-compute[71205]: DEBUG nova.compute.manager [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Instance event wait completed in 0 seconds for {{(pid=71205) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 24 00:17:52 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Guest created on hypervisor {{(pid=71205) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 24 00:17:52 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Instance spawned successfully. Apr 24 00:17:52 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 24 00:17:52 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:17:52 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:17:52 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Found default for hw_cdrom_bus of ide {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:17:52 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Found default for hw_disk_bus of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 
00:17:52 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Found default for hw_input_bus of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:17:52 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Found default for hw_pointer_model of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:17:52 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Found default for hw_video_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:17:52 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Found default for hw_vif_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:17:52 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:17:52 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Started> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:17:52 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] VM Started (Lifecycle Event) Apr 24 00:17:52 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:17:52 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:17:52 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] During sync_power_state the instance has a pending task (spawning). Skip. 
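The two lifecycle events above ("Resumed", then "Started") each trigger a power-state sync, and both are skipped because the instance still has task_state 'spawning'. A simplified, hedged illustration of that decision (not Nova's actual code; the function and names are made up, the values are the ones printed in the log):

```python
# Simplified illustration, not Nova's implementation: why both lifecycle
# events above end in "pending task (spawning). Skip."  Power states use the
# numeric values printed in the log (0 = NOSTATE in the DB, 1 = RUNNING as
# reported by libvirt).
NOSTATE, RUNNING = 0, 1

def maybe_sync_power_state(vm_state, task_state, db_power_state, vm_power_state):
    if task_state is not None:
        # An operation is still in flight; syncing now would race with it.
        return "skip: pending task (%s)" % task_state
    if db_power_state != vm_power_state:
        return "sync DB power_state %d -> %d" % (db_power_state, vm_power_state)
    return "already in sync"

# Values taken from the "Started" lifecycle event logged above.
print(maybe_sync_power_state("building", "spawning", NOSTATE, RUNNING))
```

Once the spawn finishes and task_state clears, the same check would let the DB power_state (0) be brought in line with the running state reported by the hypervisor (1).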
Apr 24 00:17:52 user nova-compute[71205]: INFO nova.compute.manager [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Took 5.80 seconds to spawn the instance on the hypervisor. Apr 24 00:17:52 user nova-compute[71205]: DEBUG nova.compute.manager [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:17:52 user nova-compute[71205]: INFO nova.compute.manager [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Took 6.66 seconds to build instance. Apr 24 00:17:52 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-838de810-ce42-454c-b2da-ea73f7fe9ccc tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Lock "1728a086-3a73-4157-8c2f-3606820448a9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.765s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:17:52 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "1728a086-3a73-4157-8c2f-3606820448a9" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 4.122s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:17:52 user nova-compute[71205]: INFO nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] During sync_power_state the instance has a pending task (spawning). Skip. 
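The "acquired ... waited" / "released ... held" pairs throughout this section (for example the per-instance build lock above, held 6.765s) come from oslo.concurrency's lock helpers. A small sketch of the usage pattern that produces them, assuming oslo.concurrency is installed; the lock name and decorated function here are illustrative stand-ins, not Nova code:

```python
# Sketch of the locking pattern behind the "acquired ... waited" /
# "released ... held" pairs in this log.  Assumes oslo.concurrency is
# installed; the lock name (an instance uuid) and the function body are
# illustrative stand-ins, not Nova code.
import logging
import time

from oslo_concurrency import lockutils

logging.basicConfig(level=logging.DEBUG)  # lockutils emits these at DEBUG

@lockutils.synchronized('1728a086-3a73-4157-8c2f-3606820448a9')
def build_and_run_instance():
    # Stand-in for the work done under the per-instance lock; in the log
    # above the real build held this lock for 6.765s.
    time.sleep(0.1)

build_and_run_instance()
```

With DEBUG logging enabled this should emit the same style of acquired/waited and released/held lines, which is what makes the hold times in this log directly comparable across requests.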
Apr 24 00:17:52 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "1728a086-3a73-4157-8c2f-3606820448a9" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:17:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:57 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:17:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:01 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:02 user nova-compute[71205]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:18:02 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] VM Stopped (Lifecycle Event) Apr 24 00:18:02 user nova-compute[71205]: DEBUG nova.compute.manager [None req-cba35767-7366-417c-bff9-a0e901a4fa31 None None] [instance: ac38bbc2-2229-4497-b501-e9230ec59a32] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:18:03 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:03 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:03 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:08 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:08 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:08 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:09 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:11 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:13 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:18:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:14 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:18:14 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71205) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 24 00:18:15 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:18:15 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Starting heal instance info cache {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 24 00:18:15 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquiring lock "7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:18:15 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:18:15 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "refresh_cache-c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:18:15 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquired lock "refresh_cache-c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b" 
{{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:18:15 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Forcefully refreshing network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 24 00:18:15 user nova-compute[71205]: DEBUG nova.compute.manager [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Starting instance... {{(pid=71205) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 24 00:18:15 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:18:15 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:18:15 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71205) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 24 00:18:15 user nova-compute[71205]: INFO nova.compute.claims [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Claim successful on node user Apr 24 00:18:15 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:18:15 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:18:15 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.340s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:18:15 user nova-compute[71205]: DEBUG nova.compute.manager [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Start building networks asynchronously for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 24 00:18:15 user nova-compute[71205]: DEBUG nova.compute.manager [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Allocating IP information in the background. 
{{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 24 00:18:15 user nova-compute[71205]: DEBUG nova.network.neutron [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] allocate_for_instance() {{(pid=71205) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 24 00:18:16 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 24 00:18:16 user nova-compute[71205]: DEBUG nova.compute.manager [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Start building block device mappings for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 24 00:18:16 user nova-compute[71205]: INFO nova.virt.block_device [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Booting with volume-backed-image fcf09ead-c5af-40cc-b5cf-92626e181ef9 at /dev/vda Apr 24 00:18:16 user nova-compute[71205]: DEBUG nova.policy [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8d0ab07106dd4995aa7e3f5b6bc70e56', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd26ba1ed4b9241f9a084db1a14a945bb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71205) authorize /opt/stack/nova/nova/policy.py:203}} Apr 24 00:18:16 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Updating instance_info_cache with network_info: [{"id": "7a3b1d96-2c84-4994-b698-c59fb56c44f8", "address": "fa:16:3e:06:2a:5d", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a3b1d96-2c", "ovs_interfaceid": "7a3b1d96-2c84-4994-b698-c59fb56c44f8", "qbh_params": null, 
"qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:18:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Releasing lock "refresh_cache-c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:18:16 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Updated the network info_cache for instance {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 24 00:18:16 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:18:16 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:18:16 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:18:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:18:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:18:16 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:18:16 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:18:16 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a2e58fac-ff2d-47e5-866d-de1f2b741cb3/disk --force-share --output=json {{(pid=71205) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:18:16 user nova-compute[71205]: WARNING nova.compute.manager [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Volume id: 1fc4aff5-cded-4845-bcc1-e512a126098b finished being created but its status is error. Apr 24 00:18:16 user nova-compute[71205]: ERROR nova.compute.manager [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Instance failed block device setup: nova.exception.VolumeNotCreated: Volume 1fc4aff5-cded-4845-bcc1-e512a126098b did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. Apr 24 00:18:16 user nova-compute[71205]: ERROR nova.compute.manager [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Traceback (most recent call last): Apr 24 00:18:16 user nova-compute[71205]: ERROR nova.compute.manager [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] File "/opt/stack/nova/nova/compute/manager.py", line 2175, in _prep_block_device Apr 24 00:18:16 user nova-compute[71205]: ERROR nova.compute.manager [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] driver_block_device.attach_block_devices( Apr 24 00:18:16 user nova-compute[71205]: ERROR nova.compute.manager [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] File "/opt/stack/nova/nova/virt/block_device.py", line 936, in attach_block_devices Apr 24 00:18:16 user nova-compute[71205]: ERROR nova.compute.manager [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] _log_and_attach(device) Apr 24 00:18:16 user nova-compute[71205]: ERROR nova.compute.manager [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] File "/opt/stack/nova/nova/virt/block_device.py", line 933, in _log_and_attach Apr 24 00:18:16 user nova-compute[71205]: ERROR nova.compute.manager [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] bdm.attach(*attach_args, **attach_kwargs) Apr 24 00:18:16 user nova-compute[71205]: ERROR nova.compute.manager [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] File "/opt/stack/nova/nova/virt/block_device.py", line 831, in attach Apr 24 00:18:16 user nova-compute[71205]: ERROR nova.compute.manager [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] self.volume_id, self.attachment_id = self._create_volume( Apr 24 00:18:16 user nova-compute[71205]: ERROR nova.compute.manager [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] File "/opt/stack/nova/nova/virt/block_device.py", line 435, in _create_volume Apr 24 00:18:16 user nova-compute[71205]: ERROR nova.compute.manager [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] self._call_wait_func(context, wait_func, volume_api, vol['id']) Apr 24 00:18:16 user nova-compute[71205]: ERROR nova.compute.manager [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] File "/opt/stack/nova/nova/virt/block_device.py", line 785, in _call_wait_func Apr 24 00:18:16 user nova-compute[71205]: ERROR nova.compute.manager [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] with excutils.save_and_reraise_exception(): Apr 24 00:18:16 user nova-compute[71205]: ERROR nova.compute.manager [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ Apr 24 00:18:16 user nova-compute[71205]: ERROR nova.compute.manager [instance: 
7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] self.force_reraise() Apr 24 00:18:16 user nova-compute[71205]: ERROR nova.compute.manager [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise Apr 24 00:18:16 user nova-compute[71205]: ERROR nova.compute.manager [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] raise self.value Apr 24 00:18:16 user nova-compute[71205]: ERROR nova.compute.manager [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] File "/opt/stack/nova/nova/virt/block_device.py", line 783, in _call_wait_func Apr 24 00:18:16 user nova-compute[71205]: ERROR nova.compute.manager [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] wait_func(context, volume_id) Apr 24 00:18:16 user nova-compute[71205]: ERROR nova.compute.manager [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] File "/opt/stack/nova/nova/compute/manager.py", line 1792, in _await_block_device_map_created Apr 24 00:18:16 user nova-compute[71205]: ERROR nova.compute.manager [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] raise exception.VolumeNotCreated(volume_id=vol_id, Apr 24 00:18:16 user nova-compute[71205]: ERROR nova.compute.manager [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] nova.exception.VolumeNotCreated: Volume 1fc4aff5-cded-4845-bcc1-e512a126098b did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. Apr 24 00:18:16 user nova-compute[71205]: ERROR nova.compute.manager [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Apr 24 00:18:16 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a2e58fac-ff2d-47e5-866d-de1f2b741cb3/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:18:16 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a2e58fac-ff2d-47e5-866d-de1f2b741cb3/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:18:16 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a2e58fac-ff2d-47e5-866d-de1f2b741cb3/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:18:16 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:18:16 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None 
req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:18:16 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:18:17 user nova-compute[71205]: DEBUG nova.network.neutron [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Successfully created port: 7b810a2e-d494-43d7-8263-761798c03a31 {{(pid=71205) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 24 00:18:17 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json" returned: 0 in 0.150s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:18:17 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:18:17 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d/disk --force-share --output=json" returned: 0 in 0.159s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:18:17 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:18:17 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d/disk --force-share 
--output=json" returned: 0 in 0.158s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:18:17 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:18:17 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:18:17 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:18:17 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:18:17 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1728a086-3a73-4157-8c2f-3606820448a9/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:18:17 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1728a086-3a73-4157-8c2f-3606820448a9/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:18:17 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1728a086-3a73-4157-8c2f-3606820448a9/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:18:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None 
req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1728a086-3a73-4157-8c2f-3606820448a9/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:18:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:18:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:18:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:18:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:18:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:18 user nova-compute[71205]: DEBUG nova.network.neutron [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Successfully updated port: 7b810a2e-d494-43d7-8263-761798c03a31 {{(pid=71205) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 24 00:18:18 user nova-compute[71205]: DEBUG nova.compute.manager [req-247767dd-42b2-4094-b0d6-efdc936eedc9 req-75a6f92f-5519-48f6-baac-e09899b8ac77 service nova] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Received event network-changed-7b810a2e-d494-43d7-8263-761798c03a31 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:18:18 user nova-compute[71205]: DEBUG nova.compute.manager [req-247767dd-42b2-4094-b0d6-efdc936eedc9 req-75a6f92f-5519-48f6-baac-e09899b8ac77 service nova] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Refreshing instance network info cache due to event 
network-changed-7b810a2e-d494-43d7-8263-761798c03a31. {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:18:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-247767dd-42b2-4094-b0d6-efdc936eedc9 req-75a6f92f-5519-48f6-baac-e09899b8ac77 service nova] Acquiring lock "refresh_cache-7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:18:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-247767dd-42b2-4094-b0d6-efdc936eedc9 req-75a6f92f-5519-48f6-baac-e09899b8ac77 service nova] Acquired lock "refresh_cache-7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:18:18 user nova-compute[71205]: DEBUG nova.network.neutron [req-247767dd-42b2-4094-b0d6-efdc936eedc9 req-75a6f92f-5519-48f6-baac-e09899b8ac77 service nova] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Refreshing network info cache for port 7b810a2e-d494-43d7-8263-761798c03a31 {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:18:18 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:18:18 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:18:18 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Hypervisor/Node resource view: name=user free_ram=8398MB free_disk=26.52342987060547GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": 
"07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": 
"type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:18:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:18:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:18:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquiring lock 
"refresh_cache-7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:18:18 user nova-compute[71205]: DEBUG nova.network.neutron [req-247767dd-42b2-4094-b0d6-efdc936eedc9 req-75a6f92f-5519-48f6-baac-e09899b8ac77 service nova] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Instance cache missing network info. {{(pid=71205) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 24 00:18:18 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance ce19423d-a6ee-4506-9cd1-ec4803abdd86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:18:18 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:18:18 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance cf2c88d5-8347-4166-a037-158f29c32d1a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:18:18 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance 1c91af3a-b837-4ff0-a236-3483ffe5277d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:18:18 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance a2e58fac-ff2d-47e5-866d-de1f2b741cb3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:18:18 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance 1728a086-3a73-4157-8c2f-3606820448a9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:18:18 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:18:18 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Total usable vcpus: 12, total allocated vcpus: 7 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:18:18 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Final resource view: name=user phys_ram=16023MB used_ram=1408MB phys_disk=40GB used_disk=6GB total_vcpus=12 used_vcpus=7 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:18:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG nova.network.neutron [req-247767dd-42b2-4094-b0d6-efdc936eedc9 req-75a6f92f-5519-48f6-baac-e09899b8ac77 service nova] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-247767dd-42b2-4094-b0d6-efdc936eedc9 req-75a6f92f-5519-48f6-baac-e09899b8ac77 service nova] Releasing lock "refresh_cache-7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquired lock "refresh_cache-7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG nova.network.neutron [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Building network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} 
{{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.383s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG nova.network.neutron [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Instance cache missing network info. {{(pid=71205) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG nova.network.neutron [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Updating instance_info_cache with network_info: [{"id": "7b810a2e-d494-43d7-8263-761798c03a31", "address": "fa:16:3e:87:07:a9", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b810a2e-d4", "ovs_interfaceid": "7b810a2e-d494-43d7-8263-761798c03a31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Releasing lock "refresh_cache-7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG nova.compute.manager [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Instance network_info: |[{"id": "7b810a2e-d494-43d7-8263-761798c03a31", "address": "fa:16:3e:87:07:a9", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": 
"tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b810a2e-d4", "ovs_interfaceid": "7b810a2e-d494-43d7-8263-761798c03a31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG nova.compute.claims [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Aborting claim: {{(pid=71205) abort /opt/stack/nova/nova/compute/claims.py:84}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None 
req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.321s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG nova.compute.manager [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Build of instance 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b aborted: Volume 1fc4aff5-cded-4845-bcc1-e512a126098b did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. {{(pid=71205) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2636}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG nova.compute.utils [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Build of instance 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b aborted: Volume 1fc4aff5-cded-4845-bcc1-e512a126098b did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. {{(pid=71205) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} Apr 24 00:18:19 user nova-compute[71205]: ERROR nova.compute.manager [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Build of instance 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b aborted: Volume 1fc4aff5-cded-4845-bcc1-e512a126098b did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error.: nova.exception.BuildAbortException: Build of instance 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b aborted: Volume 1fc4aff5-cded-4845-bcc1-e512a126098b did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. 
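
Annotation: the BuildAbortException above is raised from Nova's volume-wait path (the traceback points at _await_block_device_map_created, /opt/stack/nova/nova/compute/manager.py:1792) once Cinder reports the boot volume in the "error" state. The following is only a minimal sketch of that polling pattern, not Nova's actual implementation; get_volume_status() is a hypothetical stand-in for the real Cinder client call, and the attempt/interval defaults are illustrative.

# Minimal sketch (assumptions noted above) of a volume-wait loop: poll the
# volume's status until it becomes 'available', and fail fast the moment the
# backend reports 'error', which is exactly the situation logged above.
import time

class VolumeNotCreated(Exception):
    pass

def wait_for_volume(volume_id, get_volume_status, max_attempts=60, interval=1.0):
    for attempt in range(1, max_attempts + 1):
        status = get_volume_status(volume_id)   # hypothetical Cinder lookup
        if status == 'available':
            return attempt                      # volume is usable; report attempts taken
        if status == 'error':
            # Mirrors the log: creation "finished", but in the error state,
            # so the instance build is aborted immediately.
            raise VolumeNotCreated(
                f"Volume {volume_id} did not finish being created "
                f"(status={status}) after {attempt} attempt(s)")
        time.sleep(interval)
    raise VolumeNotCreated(
        f"Volume {volume_id} still not available after {max_attempts} attempts")

In the trace above the failure surfaces after "0 seconds or 1 attempts" because the volume was already in the error state on the first poll, so no waiting interval ever elapsed.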
Apr 24 00:18:19 user nova-compute[71205]: DEBUG nova.compute.manager [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Unplugging VIFs for instance {{(pid=71205) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:18:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-2116708494',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-2116708494',id=21,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d26ba1ed4b9241f9a084db1a14a945bb',ramdisk_id='',reservation_id='r-2nrp160w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-2021792443',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member'},tags=TagList,task_state='block_device_mapping',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:18:16Z,user_data=None,user_id='8d0ab07106dd4995aa7e3f5b6bc70e56',uuid=7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7b810a2e-d494-43d7-8263-761798c03a31", "address": "fa:16:3e:87:07:a9", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": 
"tap7b810a2e-d4", "ovs_interfaceid": "7b810a2e-d494-43d7-8263-761798c03a31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Converting VIF {"id": "7b810a2e-d494-43d7-8263-761798c03a31", "address": "fa:16:3e:87:07:a9", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b810a2e-d4", "ovs_interfaceid": "7b810a2e-d494-43d7-8263-761798c03a31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:07:a9,bridge_name='br-int',has_traffic_filtering=True,id=7b810a2e-d494-43d7-8263-761798c03a31,network=Network(52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b810a2e-d4') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG os_vif [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:07:a9,bridge_name='br-int',has_traffic_filtering=True,id=7b810a2e-d494-43d7-8263-761798c03a31,network=Network(52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b810a2e-d4') {{(pid=71205) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b810a2e-d4, bridge=br-int, if_exists=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no 
change {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 24 00:18:19 user nova-compute[71205]: INFO os_vif [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:07:a9,bridge_name='br-int',has_traffic_filtering=True,id=7b810a2e-d494-43d7-8263-761798c03a31,network=Network(52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b810a2e-d4') Apr 24 00:18:19 user nova-compute[71205]: DEBUG nova.compute.manager [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Unplugged VIFs for instance {{(pid=71205) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG nova.compute.manager [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Deallocating network for instance {{(pid=71205) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 24 00:18:19 user nova-compute[71205]: DEBUG nova.network.neutron [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] deallocate_for_instance() {{(pid=71205) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 24 00:18:20 user nova-compute[71205]: DEBUG nova.network.neutron [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:18:20 user nova-compute[71205]: INFO nova.compute.manager [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b] Took 0.56 seconds to deallocate network for instance. 
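The unplug sequence recorded above (vif.py unplug -> nova_to_osvif_vif -> os_vif.unplug -> ovsdbapp DelPortCommand on br-int) is the usual os-vif tear-down path for an OVS port. A minimal standalone sketch of that call follows, built directly from the VIFOpenVSwitch values in the log rather than going through nova.network.os_vif_util, so the object construction here is illustrative and not Nova's exact code; it also assumes a host that satisfies the os-vif "ovs" plugin's prerequisites (a reachable ovsdb).

# Hedged sketch of the os-vif unplug call seen in the entries above; values are copied
# from the logged VIFOpenVSwitch/Instance, the construction style is an assumption.
import os_vif
from os_vif.objects import instance_info, network, vif

os_vif.initialize()  # loads the linux_bridge/noop/ovs plugins logged at service start-up

port = vif.VIFOpenVSwitch(
    id='7b810a2e-d494-43d7-8263-761798c03a31',
    address='fa:16:3e:87:07:a9',
    vif_name='tap7b810a2e-d4',
    bridge_name='br-int',
    has_traffic_filtering=True,
    preserve_on_delete=False,
    active=False,
    network=network.Network(id='52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461'),
    port_profile=vif.VIFPortProfileOpenVSwitch(
        interface_id='7b810a2e-d494-43d7-8263-761798c03a31'),
)
instance = instance_info.InstanceInfo(
    uuid='7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b',
    name='tempest-ServerBootFromVolumeStableRescueTest-server-2116708494',
)

# The ovs plugin removes the tap port from br-int via an ovsdbapp transaction.
os_vif.unplug(port, instance)

Because the resulting DelPortCommand is issued with if_exists=True, unplugging a port that was never actually wired up (this build failed while the instance was still in task_state block_device_mapping) is harmless, which is why the transaction above reports "Transaction caused no change" before the network is deallocated.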
Apr 24 00:18:20 user nova-compute[71205]: INFO nova.scheduler.client.report [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Deleted allocations for instance 7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b Apr 24 00:18:20 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3eeae232-27ed-4300-b4c6-bd5b3f676e6e tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "7eadcb12-3a4f-49c1-9bcc-8c2f94052e6b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 5.205s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:18:21 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:21 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:18:21 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:18:21 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:18:23 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:23 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:25 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:28 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:28 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:28 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:33 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:33 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:40 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:40 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:44 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:49 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:18:49 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:49 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:18:49 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:18:49 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:18:49 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:18:59 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:19:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 
00:19:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:19:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:19:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:06 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-2d359527-009f-4d8c-82c1-3cac5a19c6d6 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquiring lock "1c91af3a-b837-4ff0-a236-3483ffe5277d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:19:06 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-2d359527-009f-4d8c-82c1-3cac5a19c6d6 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "1c91af3a-b837-4ff0-a236-3483ffe5277d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:19:06 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-2d359527-009f-4d8c-82c1-3cac5a19c6d6 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquiring lock "1c91af3a-b837-4ff0-a236-3483ffe5277d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:19:06 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-2d359527-009f-4d8c-82c1-3cac5a19c6d6 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "1c91af3a-b837-4ff0-a236-3483ffe5277d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:19:06 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-2d359527-009f-4d8c-82c1-3cac5a19c6d6 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "1c91af3a-b837-4ff0-a236-3483ffe5277d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:19:06 user nova-compute[71205]: INFO nova.compute.manager [None req-2d359527-009f-4d8c-82c1-3cac5a19c6d6 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Terminating instance Apr 24 00:19:06 user nova-compute[71205]: DEBUG nova.compute.manager [None req-2d359527-009f-4d8c-82c1-3cac5a19c6d6 
tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Start destroying the instance on the hypervisor. {{(pid=71205) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 24 00:19:06 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:06 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:06 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:06 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:06 user nova-compute[71205]: DEBUG nova.compute.manager [req-e238b01d-96df-4cf8-a8fb-5ef2922874fa req-995f92d2-5d0d-45ba-af2d-cd670de9ac6a service nova] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Received event network-vif-unplugged-7cef5b8b-d733-4f6a-8567-4b9e7e650fbb {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:19:06 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-e238b01d-96df-4cf8-a8fb-5ef2922874fa req-995f92d2-5d0d-45ba-af2d-cd670de9ac6a service nova] Acquiring lock "1c91af3a-b837-4ff0-a236-3483ffe5277d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:19:06 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-e238b01d-96df-4cf8-a8fb-5ef2922874fa req-995f92d2-5d0d-45ba-af2d-cd670de9ac6a service nova] Lock "1c91af3a-b837-4ff0-a236-3483ffe5277d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:19:06 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-e238b01d-96df-4cf8-a8fb-5ef2922874fa req-995f92d2-5d0d-45ba-af2d-cd670de9ac6a service nova] Lock "1c91af3a-b837-4ff0-a236-3483ffe5277d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:19:06 user nova-compute[71205]: DEBUG nova.compute.manager [req-e238b01d-96df-4cf8-a8fb-5ef2922874fa req-995f92d2-5d0d-45ba-af2d-cd670de9ac6a service nova] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] No waiting events found dispatching network-vif-unplugged-7cef5b8b-d733-4f6a-8567-4b9e7e650fbb {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:19:06 user nova-compute[71205]: DEBUG nova.compute.manager [req-e238b01d-96df-4cf8-a8fb-5ef2922874fa req-995f92d2-5d0d-45ba-af2d-cd670de9ac6a service nova] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Received event network-vif-unplugged-7cef5b8b-d733-4f6a-8567-4b9e7e650fbb for instance with task_state deleting. 
{{(pid=71205) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 24 00:19:06 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Instance destroyed successfully. Apr 24 00:19:06 user nova-compute[71205]: DEBUG nova.objects.instance [None req-2d359527-009f-4d8c-82c1-3cac5a19c6d6 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lazy-loading 'resources' on Instance uuid 1c91af3a-b837-4ff0-a236-3483ffe5277d {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:19:06 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-2d359527-009f-4d8c-82c1-3cac5a19c6d6 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:15:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-515303589',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-515303589',id=18,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-24T00:15:23Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='d26ba1ed4b9241f9a084db1a14a945bb',ramdisk_id='',reservation_id='r-b2izt7wx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-2021792443',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-24T00:17:13Z,user_data=None,user_id='8d0ab07106dd4995aa7e3f5b6bc70e56',uuid=1c91af3a-b837-4ff0-a236-3483ffe5277d,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7cef5b8b-d733-4f6a-8567-4b9e7e650fbb", "address": "fa:16:3e:ab:a9:8e", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cef5b8b-d7", "ovs_interfaceid": "7cef5b8b-d733-4f6a-8567-4b9e7e650fbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 24 00:19:06 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-2d359527-009f-4d8c-82c1-3cac5a19c6d6 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Converting VIF {"id": "7cef5b8b-d733-4f6a-8567-4b9e7e650fbb", "address": "fa:16:3e:ab:a9:8e", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cef5b8b-d7", "ovs_interfaceid": "7cef5b8b-d733-4f6a-8567-4b9e7e650fbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:19:06 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-2d359527-009f-4d8c-82c1-3cac5a19c6d6 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:a9:8e,bridge_name='br-int',has_traffic_filtering=True,id=7cef5b8b-d733-4f6a-8567-4b9e7e650fbb,network=Network(52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7cef5b8b-d7') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:19:06 user nova-compute[71205]: DEBUG os_vif [None req-2d359527-009f-4d8c-82c1-3cac5a19c6d6 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:a9:8e,bridge_name='br-int',has_traffic_filtering=True,id=7cef5b8b-d733-4f6a-8567-4b9e7e650fbb,network=Network(52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7cef5b8b-d7') {{(pid=71205) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 24 00:19:06 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:06 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] 
Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7cef5b8b-d7, bridge=br-int, if_exists=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:19:06 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:06 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:19:06 user nova-compute[71205]: INFO os_vif [None req-2d359527-009f-4d8c-82c1-3cac5a19c6d6 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:a9:8e,bridge_name='br-int',has_traffic_filtering=True,id=7cef5b8b-d733-4f6a-8567-4b9e7e650fbb,network=Network(52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7cef5b8b-d7') Apr 24 00:19:06 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-2d359527-009f-4d8c-82c1-3cac5a19c6d6 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Deleting instance files /opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d_del Apr 24 00:19:06 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-2d359527-009f-4d8c-82c1-3cac5a19c6d6 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Deletion of /opt/stack/data/nova/instances/1c91af3a-b837-4ff0-a236-3483ffe5277d_del complete Apr 24 00:19:06 user nova-compute[71205]: INFO nova.compute.manager [None req-2d359527-009f-4d8c-82c1-3cac5a19c6d6 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Took 0.67 seconds to destroy the instance on the hypervisor. Apr 24 00:19:06 user nova-compute[71205]: DEBUG oslo.service.loopingcall [None req-2d359527-009f-4d8c-82c1-3cac5a19c6d6 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71205) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 24 00:19:06 user nova-compute[71205]: DEBUG nova.compute.manager [-] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Deallocating network for instance {{(pid=71205) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 24 00:19:06 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] deallocate_for_instance() {{(pid=71205) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 24 00:19:07 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:19:07 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Took 1.19 seconds to deallocate network for instance. Apr 24 00:19:07 user nova-compute[71205]: DEBUG nova.compute.manager [req-8bbc8dcc-d280-4205-be56-b66cfd223663 req-cb46a5b3-4edb-499b-ae81-49a9eca0638a service nova] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Received event network-vif-deleted-7cef5b8b-d733-4f6a-8567-4b9e7e650fbb {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:19:08 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-2d359527-009f-4d8c-82c1-3cac5a19c6d6 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:19:08 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-2d359527-009f-4d8c-82c1-3cac5a19c6d6 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:19:08 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-2d359527-009f-4d8c-82c1-3cac5a19c6d6 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:19:08 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-2d359527-009f-4d8c-82c1-3cac5a19c6d6 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:19:08 user nova-compute[71205]: DEBUG 
oslo_concurrency.lockutils [None req-2d359527-009f-4d8c-82c1-3cac5a19c6d6 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.327s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:19:08 user nova-compute[71205]: INFO nova.scheduler.client.report [None req-2d359527-009f-4d8c-82c1-3cac5a19c6d6 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Deleted allocations for instance 1c91af3a-b837-4ff0-a236-3483ffe5277d Apr 24 00:19:08 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-2d359527-009f-4d8c-82c1-3cac5a19c6d6 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "1c91af3a-b837-4ff0-a236-3483ffe5277d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.374s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:19:08 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:08 user nova-compute[71205]: DEBUG nova.compute.manager [req-32b6edc5-b74c-494b-b061-565f13290b84 req-8c571e19-03b7-43f1-87f3-ec3026398718 service nova] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Received event network-vif-plugged-7cef5b8b-d733-4f6a-8567-4b9e7e650fbb {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:19:08 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-32b6edc5-b74c-494b-b061-565f13290b84 req-8c571e19-03b7-43f1-87f3-ec3026398718 service nova] Acquiring lock "1c91af3a-b837-4ff0-a236-3483ffe5277d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:19:08 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-32b6edc5-b74c-494b-b061-565f13290b84 req-8c571e19-03b7-43f1-87f3-ec3026398718 service nova] Lock "1c91af3a-b837-4ff0-a236-3483ffe5277d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:19:08 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-32b6edc5-b74c-494b-b061-565f13290b84 req-8c571e19-03b7-43f1-87f3-ec3026398718 service nova] Lock "1c91af3a-b837-4ff0-a236-3483ffe5277d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:19:08 user nova-compute[71205]: DEBUG nova.compute.manager [req-32b6edc5-b74c-494b-b061-565f13290b84 req-8c571e19-03b7-43f1-87f3-ec3026398718 service nova] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] No waiting events found dispatching network-vif-plugged-7cef5b8b-d733-4f6a-8567-4b9e7e650fbb {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:19:08 user nova-compute[71205]: WARNING nova.compute.manager [req-32b6edc5-b74c-494b-b061-565f13290b84 
req-8c571e19-03b7-43f1-87f3-ec3026398718 service nova] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Received unexpected event network-vif-plugged-7cef5b8b-d733-4f6a-8567-4b9e7e650fbb for instance with vm_state deleted and task_state None. Apr 24 00:19:09 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:11 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:15 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:19:15 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:19:15 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71205) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 24 00:19:16 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:16 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:17 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:19:17 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:19:17 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Starting heal instance info cache {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 24 00:19:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "refresh_cache-cf2c88d5-8347-4166-a037-158f29c32d1a" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:19:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquired lock "refresh_cache-cf2c88d5-8347-4166-a037-158f29c32d1a" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:19:17 user nova-compute[71205]: DEBUG 
nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Forcefully refreshing network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 24 00:19:17 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Updating instance_info_cache with network_info: [{"id": "783a3713-64fb-48f3-b3ba-0312249006eb", "address": "fa:16:3e:87:78:10", "network": {"id": "551bb188-8d6d-484a-813f-5f76f0ec0e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1308396726-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.38", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "211ed190ee5c4a0b98b85960339ea437", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap783a3713-64", "ovs_interfaceid": "783a3713-64fb-48f3-b3ba-0312249006eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:19:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Releasing lock "refresh_cache-cf2c88d5-8347-4166-a037-158f29c32d1a" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:19:17 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Updated the network info_cache for instance {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 24 00:19:17 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:19:17 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:19:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:19:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71205) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:19:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:19:17 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:19:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a2e58fac-ff2d-47e5-866d-de1f2b741cb3/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:19:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a2e58fac-ff2d-47e5-866d-de1f2b741cb3/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:19:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a2e58fac-ff2d-47e5-866d-de1f2b741cb3/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:19:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a2e58fac-ff2d-47e5-866d-de1f2b741cb3/disk --force-share --output=json" returned: 0 in 0.159s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:19:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:19:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71205) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:19:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:19:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:19:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:19:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:19:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:19:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:19:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1728a086-3a73-4157-8c2f-3606820448a9/disk --force-share --output=json {{(pid=71205) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:19:19 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1728a086-3a73-4157-8c2f-3606820448a9/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:19:19 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1728a086-3a73-4157-8c2f-3606820448a9/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:19:19 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1728a086-3a73-4157-8c2f-3606820448a9/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:19:19 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:19:19 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:19:19 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:19:19 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:19:19 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host 
appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:19:19 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:19:19 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Hypervisor/Node resource view: name=user free_ram=8605MB free_disk=26.56568145751953GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:19:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:19:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:19:20 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance ce19423d-a6ee-4506-9cd1-ec4803abdd86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:19:20 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:19:20 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance cf2c88d5-8347-4166-a037-158f29c32d1a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:19:20 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance a2e58fac-ff2d-47e5-866d-de1f2b741cb3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:19:20 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance 1728a086-3a73-4157-8c2f-3606820448a9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:19:20 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Total usable vcpus: 12, total allocated vcpus: 5 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:19:20 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Final resource view: name=user phys_ram=16023MB used_ram=1152MB phys_disk=40GB used_disk=5GB total_vcpus=12 used_vcpus=5 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:19:20 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:19:20 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:19:20 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:19:20 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.329s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:19:20 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:19:20 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:19:20 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running 
periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:19:20 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:19:21 user nova-compute[71205]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:19:21 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] VM Stopped (Lifecycle Event) Apr 24 00:19:21 user nova-compute[71205]: DEBUG nova.compute.manager [None req-1d754910-70ab-49d6-aed6-9f7c6f8f97ba None None] [instance: 1c91af3a-b837-4ff0-a236-3483ffe5277d] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:19:21 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:22 user nova-compute[71205]: DEBUG nova.compute.manager [req-e515dc65-dbe0-4850-8e84-06d6fd1b9857 req-83791e50-75f9-4a97-9cb8-60b27d182757 service nova] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Received event network-changed-bbf311b8-ca79-433b-9979-dd2ee6102146 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:19:22 user nova-compute[71205]: DEBUG nova.compute.manager [req-e515dc65-dbe0-4850-8e84-06d6fd1b9857 req-83791e50-75f9-4a97-9cb8-60b27d182757 service nova] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Refreshing instance network info cache due to event network-changed-bbf311b8-ca79-433b-9979-dd2ee6102146. {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:19:22 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-e515dc65-dbe0-4850-8e84-06d6fd1b9857 req-83791e50-75f9-4a97-9cb8-60b27d182757 service nova] Acquiring lock "refresh_cache-a2e58fac-ff2d-47e5-866d-de1f2b741cb3" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:19:22 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-e515dc65-dbe0-4850-8e84-06d6fd1b9857 req-83791e50-75f9-4a97-9cb8-60b27d182757 service nova] Acquired lock "refresh_cache-a2e58fac-ff2d-47e5-866d-de1f2b741cb3" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:19:22 user nova-compute[71205]: DEBUG nova.network.neutron [req-e515dc65-dbe0-4850-8e84-06d6fd1b9857 req-83791e50-75f9-4a97-9cb8-60b27d182757 service nova] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Refreshing network info cache for port bbf311b8-ca79-433b-9979-dd2ee6102146 {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:19:22 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:22 user nova-compute[71205]: DEBUG nova.network.neutron [req-e515dc65-dbe0-4850-8e84-06d6fd1b9857 req-83791e50-75f9-4a97-9cb8-60b27d182757 service nova] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Updated VIF entry in instance network info cache for port bbf311b8-ca79-433b-9979-dd2ee6102146. 
{{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:19:22 user nova-compute[71205]: DEBUG nova.network.neutron [req-e515dc65-dbe0-4850-8e84-06d6fd1b9857 req-83791e50-75f9-4a97-9cb8-60b27d182757 service nova] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Updating instance_info_cache with network_info: [{"id": "bbf311b8-ca79-433b-9979-dd2ee6102146", "address": "fa:16:3e:90:73:83", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbbf311b8-ca", "ovs_interfaceid": "bbf311b8-ca79-433b-9979-dd2ee6102146", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:19:22 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-e515dc65-dbe0-4850-8e84-06d6fd1b9857 req-83791e50-75f9-4a97-9cb8-60b27d182757 service nova] Releasing lock "refresh_cache-a2e58fac-ff2d-47e5-866d-de1f2b741cb3" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:19:23 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-10085def-6121-437f-950d-964299f7a5f7 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquiring lock "a2e58fac-ff2d-47e5-866d-de1f2b741cb3" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:19:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-10085def-6121-437f-950d-964299f7a5f7 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "a2e58fac-ff2d-47e5-866d-de1f2b741cb3" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:19:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-10085def-6121-437f-950d-964299f7a5f7 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquiring lock "a2e58fac-ff2d-47e5-866d-de1f2b741cb3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:19:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-10085def-6121-437f-950d-964299f7a5f7 
tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "a2e58fac-ff2d-47e5-866d-de1f2b741cb3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:19:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-10085def-6121-437f-950d-964299f7a5f7 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "a2e58fac-ff2d-47e5-866d-de1f2b741cb3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:19:23 user nova-compute[71205]: INFO nova.compute.manager [None req-10085def-6121-437f-950d-964299f7a5f7 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Terminating instance Apr 24 00:19:23 user nova-compute[71205]: DEBUG nova.compute.manager [None req-10085def-6121-437f-950d-964299f7a5f7 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Start destroying the instance on the hypervisor. {{(pid=71205) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 24 00:19:23 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:24 user nova-compute[71205]: DEBUG nova.compute.manager [req-ee3ff0f1-6dc0-4b1e-8122-caebda62442f req-710b01b1-85c0-4a12-977c-4581de5fc5a7 service nova] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Received event network-vif-unplugged-bbf311b8-ca79-433b-9979-dd2ee6102146 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:19:24 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-ee3ff0f1-6dc0-4b1e-8122-caebda62442f req-710b01b1-85c0-4a12-977c-4581de5fc5a7 service nova] Acquiring lock "a2e58fac-ff2d-47e5-866d-de1f2b741cb3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:19:24 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-ee3ff0f1-6dc0-4b1e-8122-caebda62442f req-710b01b1-85c0-4a12-977c-4581de5fc5a7 service nova] Lock "a2e58fac-ff2d-47e5-866d-de1f2b741cb3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:19:24 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-ee3ff0f1-6dc0-4b1e-8122-caebda62442f req-710b01b1-85c0-4a12-977c-4581de5fc5a7 service nova] Lock "a2e58fac-ff2d-47e5-866d-de1f2b741cb3-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:19:24 user nova-compute[71205]: DEBUG nova.compute.manager [req-ee3ff0f1-6dc0-4b1e-8122-caebda62442f req-710b01b1-85c0-4a12-977c-4581de5fc5a7 service nova] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] No waiting events found dispatching network-vif-unplugged-bbf311b8-ca79-433b-9979-dd2ee6102146 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:19:24 user nova-compute[71205]: DEBUG nova.compute.manager [req-ee3ff0f1-6dc0-4b1e-8122-caebda62442f req-710b01b1-85c0-4a12-977c-4581de5fc5a7 service nova] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Received event network-vif-unplugged-bbf311b8-ca79-433b-9979-dd2ee6102146 for instance with task_state deleting. {{(pid=71205) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 24 00:19:24 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Instance destroyed successfully. Apr 24 00:19:24 user nova-compute[71205]: DEBUG nova.objects.instance [None req-10085def-6121-437f-950d-964299f7a5f7 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lazy-loading 'resources' on Instance uuid a2e58fac-ff2d-47e5-866d-de1f2b741cb3 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:19:24 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-10085def-6121-437f-950d-964299f7a5f7 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:17:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-313901605',display_name='tempest-AttachVolumeNegativeTest-server-313901605',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-313901605',id=19,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL9ANefNai0/rsSQaIoE6MwpNZsPCDJ1hAlE0Sqe0cS0k9aiWxLvZOvWrAlWMxkgj6Ru1wbGkR0RjHJCL/oloMLQjvQ7osVNQYPYNqwz8q0ZbFbyL90CnzPAEEVrQST+9A==',key_name='tempest-keypair-669922603',keypairs=,launch_index=0,launched_at=2023-04-24T00:17:39Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='97d1e8a757a746329ea363af81a3c6b4',ramdisk_id='',reservation_id='r-4yh2fdof',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-272859998',owner_user_name='tempest-AttachVolumeNegativeTest-272859998-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-24T00:17:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35edcadbe77c4f4fa8304216e7f61d4a',uuid=a2e58fac-ff2d-47e5-866d-de1f2b741cb3,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bbf311b8-ca79-433b-9979-dd2ee6102146", "address": "fa:16:3e:90:73:83", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbbf311b8-ca", "ovs_interfaceid": "bbf311b8-ca79-433b-9979-dd2ee6102146", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 24 00:19:24 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-10085def-6121-437f-950d-964299f7a5f7 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Converting VIF {"id": "bbf311b8-ca79-433b-9979-dd2ee6102146", "address": "fa:16:3e:90:73:83", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbbf311b8-ca", "ovs_interfaceid": "bbf311b8-ca79-433b-9979-dd2ee6102146", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:19:24 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-10085def-6121-437f-950d-964299f7a5f7 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:90:73:83,bridge_name='br-int',has_traffic_filtering=True,id=bbf311b8-ca79-433b-9979-dd2ee6102146,network=Network(1a83eb44-caaf-46ff-8823-be4580052b3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbbf311b8-ca') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:19:24 user nova-compute[71205]: DEBUG os_vif [None req-10085def-6121-437f-950d-964299f7a5f7 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:90:73:83,bridge_name='br-int',has_traffic_filtering=True,id=bbf311b8-ca79-433b-9979-dd2ee6102146,network=Network(1a83eb44-caaf-46ff-8823-be4580052b3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbbf311b8-ca') {{(pid=71205) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 24 00:19:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbbf311b8-ca, bridge=br-int, if_exists=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:19:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:19:24 user nova-compute[71205]: INFO os_vif [None req-10085def-6121-437f-950d-964299f7a5f7 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:90:73:83,bridge_name='br-int',has_traffic_filtering=True,id=bbf311b8-ca79-433b-9979-dd2ee6102146,network=Network(1a83eb44-caaf-46ff-8823-be4580052b3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbbf311b8-ca') Apr 24 00:19:24 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-10085def-6121-437f-950d-964299f7a5f7 tempest-AttachVolumeNegativeTest-272859998 
tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Deleting instance files /opt/stack/data/nova/instances/a2e58fac-ff2d-47e5-866d-de1f2b741cb3_del Apr 24 00:19:24 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-10085def-6121-437f-950d-964299f7a5f7 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Deletion of /opt/stack/data/nova/instances/a2e58fac-ff2d-47e5-866d-de1f2b741cb3_del complete Apr 24 00:19:24 user nova-compute[71205]: INFO nova.compute.manager [None req-10085def-6121-437f-950d-964299f7a5f7 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Took 0.67 seconds to destroy the instance on the hypervisor. Apr 24 00:19:24 user nova-compute[71205]: DEBUG oslo.service.loopingcall [None req-10085def-6121-437f-950d-964299f7a5f7 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71205) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 24 00:19:24 user nova-compute[71205]: DEBUG nova.compute.manager [-] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Deallocating network for instance {{(pid=71205) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 24 00:19:24 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] deallocate_for_instance() {{(pid=71205) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 24 00:19:25 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:19:25 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Took 0.98 seconds to deallocate network for instance. 
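The entries above walk through one complete teardown of instance a2e58fac-ff2d-47e5-866d-de1f2b741cb3: lock acquisition, VIF unplug, file deletion, network deallocation. Because journald wraps these entries across physical lines and interleaves several request IDs, a small helper can make the sequence easier to follow. The sketch below is not part of Nova; the local file name `nova-compute.log` and the "entry starts with a timestamp" heuristic are assumptions about how this dump was saved.

```python
# Hypothetical helper for tracing one teardown in a saved copy of this journal.
# Assumptions (not from the log itself): the dump lives in nova-compute.log and
# every journald entry begins with an "Apr 24 HH:MM:SS" prefix.
import re

ENTRY_START = re.compile(r"^Apr 24 \d{2}:\d{2}:\d{2} ")
REQ_ID = re.compile(r"req-[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}")


def entries(path):
    """Yield whole journald entries, re-joining lines that were wrapped."""
    current = []
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            if ENTRY_START.match(line) and current:
                yield "".join(current)
                current = []
            current.append(line)
    if current:
        yield "".join(current)


def trace(path, instance_uuid):
    """Print every entry mentioning the instance or a request id seen with it."""
    seen_req_ids = set()
    for entry in entries(path):
        entry_req_ids = set(REQ_ID.findall(entry))
        if instance_uuid in entry:
            seen_req_ids.update(entry_req_ids)
        if instance_uuid in entry or (seen_req_ids & entry_req_ids):
            # Drop the trailing "{{(pid=...) ...}}" context for readability.
            print(entry.split("{{")[0].strip())


if __name__ == "__main__":
    trace("nova-compute.log", "a2e58fac-ff2d-47e5-866d-de1f2b741cb3")
```

Run against this excerpt, the trace would surface the Terminating/unplug/deallocate entries for that UUID in order, without the surrounding periodic-task noise.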
Apr 24 00:19:25 user nova-compute[71205]: DEBUG nova.compute.manager [req-e11a3711-0f5b-4427-864c-7b925133ad09 req-843b97ad-5e7e-445f-a56c-0e960fdff7a2 service nova] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Received event network-vif-deleted-bbf311b8-ca79-433b-9979-dd2ee6102146 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:19:25 user nova-compute[71205]: INFO nova.compute.manager [req-e11a3711-0f5b-4427-864c-7b925133ad09 req-843b97ad-5e7e-445f-a56c-0e960fdff7a2 service nova] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Neutron deleted interface bbf311b8-ca79-433b-9979-dd2ee6102146; detaching it from the instance and deleting it from the info cache Apr 24 00:19:25 user nova-compute[71205]: DEBUG nova.network.neutron [req-e11a3711-0f5b-4427-864c-7b925133ad09 req-843b97ad-5e7e-445f-a56c-0e960fdff7a2 service nova] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:19:25 user nova-compute[71205]: DEBUG nova.compute.manager [req-e11a3711-0f5b-4427-864c-7b925133ad09 req-843b97ad-5e7e-445f-a56c-0e960fdff7a2 service nova] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Detach interface failed, port_id=bbf311b8-ca79-433b-9979-dd2ee6102146, reason: Instance a2e58fac-ff2d-47e5-866d-de1f2b741cb3 could not be found. {{(pid=71205) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 24 00:19:25 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:25 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-10085def-6121-437f-950d-964299f7a5f7 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:19:25 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-10085def-6121-437f-950d-964299f7a5f7 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:19:25 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-10085def-6121-437f-950d-964299f7a5f7 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:19:25 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-10085def-6121-437f-950d-964299f7a5f7 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': 
{'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:19:25 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-10085def-6121-437f-950d-964299f7a5f7 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.200s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:19:25 user nova-compute[71205]: INFO nova.scheduler.client.report [None req-10085def-6121-437f-950d-964299f7a5f7 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Deleted allocations for instance a2e58fac-ff2d-47e5-866d-de1f2b741cb3 Apr 24 00:19:26 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-10085def-6121-437f-950d-964299f7a5f7 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "a2e58fac-ff2d-47e5-866d-de1f2b741cb3" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.125s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:19:26 user nova-compute[71205]: DEBUG nova.compute.manager [req-bba0c61e-e115-4f55-a78b-abc368b9d003 req-dea2d2b2-304b-4858-82de-f76e86f07a62 service nova] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Received event network-vif-plugged-bbf311b8-ca79-433b-9979-dd2ee6102146 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:19:26 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bba0c61e-e115-4f55-a78b-abc368b9d003 req-dea2d2b2-304b-4858-82de-f76e86f07a62 service nova] Acquiring lock "a2e58fac-ff2d-47e5-866d-de1f2b741cb3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:19:26 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bba0c61e-e115-4f55-a78b-abc368b9d003 req-dea2d2b2-304b-4858-82de-f76e86f07a62 service nova] Lock "a2e58fac-ff2d-47e5-866d-de1f2b741cb3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:19:26 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bba0c61e-e115-4f55-a78b-abc368b9d003 req-dea2d2b2-304b-4858-82de-f76e86f07a62 service nova] Lock "a2e58fac-ff2d-47e5-866d-de1f2b741cb3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:19:26 user nova-compute[71205]: DEBUG nova.compute.manager [req-bba0c61e-e115-4f55-a78b-abc368b9d003 req-dea2d2b2-304b-4858-82de-f76e86f07a62 service nova] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] No waiting events found dispatching network-vif-plugged-bbf311b8-ca79-433b-9979-dd2ee6102146 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:19:26 user nova-compute[71205]: WARNING nova.compute.manager [req-bba0c61e-e115-4f55-a78b-abc368b9d003 req-dea2d2b2-304b-4858-82de-f76e86f07a62 
service nova] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Received unexpected event network-vif-plugged-bbf311b8-ca79-433b-9979-dd2ee6102146 for instance with vm_state deleted and task_state None. Apr 24 00:19:28 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:33 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:38 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-4d8dddce-5407-4b8b-8af4-1ad04432d6ee tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Acquiring lock "1728a086-3a73-4157-8c2f-3606820448a9" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:19:38 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-4d8dddce-5407-4b8b-8af4-1ad04432d6ee tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Lock "1728a086-3a73-4157-8c2f-3606820448a9" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:19:38 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-4d8dddce-5407-4b8b-8af4-1ad04432d6ee tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Acquiring lock "1728a086-3a73-4157-8c2f-3606820448a9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:19:38 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-4d8dddce-5407-4b8b-8af4-1ad04432d6ee tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Lock "1728a086-3a73-4157-8c2f-3606820448a9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:19:38 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-4d8dddce-5407-4b8b-8af4-1ad04432d6ee tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Lock "1728a086-3a73-4157-8c2f-3606820448a9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:19:38 user nova-compute[71205]: INFO nova.compute.manager [None req-4d8dddce-5407-4b8b-8af4-1ad04432d6ee tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 
1728a086-3a73-4157-8c2f-3606820448a9] Terminating instance Apr 24 00:19:38 user nova-compute[71205]: DEBUG nova.compute.manager [None req-4d8dddce-5407-4b8b-8af4-1ad04432d6ee tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Start destroying the instance on the hypervisor. {{(pid=71205) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 24 00:19:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:38 user nova-compute[71205]: DEBUG nova.compute.manager [req-780e408e-9f7c-490a-b5d3-ec6715a59822 req-b392cb0b-935b-41a9-a82a-fcdeb11cf231 service nova] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Received event network-vif-unplugged-1d88c88b-4db7-4078-b68a-941a4a4e7920 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:19:38 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-780e408e-9f7c-490a-b5d3-ec6715a59822 req-b392cb0b-935b-41a9-a82a-fcdeb11cf231 service nova] Acquiring lock "1728a086-3a73-4157-8c2f-3606820448a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:19:38 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-780e408e-9f7c-490a-b5d3-ec6715a59822 req-b392cb0b-935b-41a9-a82a-fcdeb11cf231 service nova] Lock "1728a086-3a73-4157-8c2f-3606820448a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:19:38 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-780e408e-9f7c-490a-b5d3-ec6715a59822 req-b392cb0b-935b-41a9-a82a-fcdeb11cf231 service nova] Lock "1728a086-3a73-4157-8c2f-3606820448a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:19:38 user nova-compute[71205]: DEBUG nova.compute.manager [req-780e408e-9f7c-490a-b5d3-ec6715a59822 req-b392cb0b-935b-41a9-a82a-fcdeb11cf231 service nova] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] No waiting events found dispatching network-vif-unplugged-1d88c88b-4db7-4078-b68a-941a4a4e7920 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:19:38 user nova-compute[71205]: DEBUG nova.compute.manager [req-780e408e-9f7c-490a-b5d3-ec6715a59822 req-b392cb0b-935b-41a9-a82a-fcdeb11cf231 service nova] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Received event network-vif-unplugged-1d88c88b-4db7-4078-b68a-941a4a4e7920 for instance with task_state deleting. 
{{(pid=71205) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 24 00:19:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:38 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Instance destroyed successfully. Apr 24 00:19:38 user nova-compute[71205]: DEBUG nova.objects.instance [None req-4d8dddce-5407-4b8b-8af4-1ad04432d6ee tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Lazy-loading 'resources' on Instance uuid 1728a086-3a73-4157-8c2f-3606820448a9 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:19:38 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-4d8dddce-5407-4b8b-8af4-1ad04432d6ee tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:17:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-SnapshotDataIntegrityTests-server-1538855312',display_name='tempest-SnapshotDataIntegrityTests-server-1538855312',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-snapshotdataintegritytests-server-1538855312',id=20,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKeZP0p0lwKvXDXvuW52XwDq3F/jdLDEwtvJVwiOp4OTfyWXeIMaaEAMfG/SE3qJxELt/agLszkPj+lhygv4s0kc0XNWg8aaRmcCvDGrSOIMb1gh5CpK3xvghy/XTnLEEA==',key_name='tempest-SnapshotDataIntegrityTests-1799058279',keypairs=,launch_index=0,launched_at=2023-04-24T00:17:52Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='9011aa88afdb40f4be4c2e9846fa72dd',ramdisk_id='',reservation_id='r-fz3qyeh2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-SnapshotDataIntegrityTests-901788601',owner_user_name='tempest-SnapshotDataIntegrityTests-901788601-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-24T00:17:53Z,user_data=None,user_id='4192d379850d40c5b684d8835548acd5',uuid=1728a086-3a73-4157-8c2f-3606820448a9,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1d88c88b-4db7-4078-b68a-941a4a4e7920", "address": "fa:16:3e:14:05:1e", "network": {"id": "13deae10-040e-47ec-a868-0cd1b10e09e1", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-566937497-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9011aa88afdb40f4be4c2e9846fa72dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d88c88b-4d", "ovs_interfaceid": "1d88c88b-4db7-4078-b68a-941a4a4e7920", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 24 00:19:38 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-4d8dddce-5407-4b8b-8af4-1ad04432d6ee tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Converting VIF {"id": "1d88c88b-4db7-4078-b68a-941a4a4e7920", "address": "fa:16:3e:14:05:1e", "network": {"id": "13deae10-040e-47ec-a868-0cd1b10e09e1", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-566937497-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "9011aa88afdb40f4be4c2e9846fa72dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d88c88b-4d", "ovs_interfaceid": "1d88c88b-4db7-4078-b68a-941a4a4e7920", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:19:38 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-4d8dddce-5407-4b8b-8af4-1ad04432d6ee tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:05:1e,bridge_name='br-int',has_traffic_filtering=True,id=1d88c88b-4db7-4078-b68a-941a4a4e7920,network=Network(13deae10-040e-47ec-a868-0cd1b10e09e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d88c88b-4d') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:19:38 user nova-compute[71205]: DEBUG os_vif [None req-4d8dddce-5407-4b8b-8af4-1ad04432d6ee tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:05:1e,bridge_name='br-int',has_traffic_filtering=True,id=1d88c88b-4db7-4078-b68a-941a4a4e7920,network=Network(13deae10-040e-47ec-a868-0cd1b10e09e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d88c88b-4d') {{(pid=71205) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 24 00:19:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1d88c88b-4d, bridge=br-int, if_exists=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:19:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:19:38 user nova-compute[71205]: INFO os_vif [None req-4d8dddce-5407-4b8b-8af4-1ad04432d6ee tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:05:1e,bridge_name='br-int',has_traffic_filtering=True,id=1d88c88b-4db7-4078-b68a-941a4a4e7920,network=Network(13deae10-040e-47ec-a868-0cd1b10e09e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d88c88b-4d') Apr 24 00:19:38 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-4d8dddce-5407-4b8b-8af4-1ad04432d6ee tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Deleting instance files /opt/stack/data/nova/instances/1728a086-3a73-4157-8c2f-3606820448a9_del Apr 24 
00:19:38 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-4d8dddce-5407-4b8b-8af4-1ad04432d6ee tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Deletion of /opt/stack/data/nova/instances/1728a086-3a73-4157-8c2f-3606820448a9_del complete Apr 24 00:19:38 user nova-compute[71205]: INFO nova.compute.manager [None req-4d8dddce-5407-4b8b-8af4-1ad04432d6ee tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Took 0.67 seconds to destroy the instance on the hypervisor. Apr 24 00:19:38 user nova-compute[71205]: DEBUG oslo.service.loopingcall [None req-4d8dddce-5407-4b8b-8af4-1ad04432d6ee tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71205) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 24 00:19:38 user nova-compute[71205]: DEBUG nova.compute.manager [-] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Deallocating network for instance {{(pid=71205) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 24 00:19:38 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] deallocate_for_instance() {{(pid=71205) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 24 00:19:39 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:19:39 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Took 0.53 seconds to deallocate network for instance. 
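The resource tracker keeps reporting the same inventory to placement ("Inventory has not changed for provider 67b4ed56-..."), with per-resource `total`, `reserved`, and `allocation_ratio` values. Placement treats the consumable capacity of each resource class as roughly `(total - reserved) * allocation_ratio`, which is why a host with 12 physical vCPUs and `allocation_ratio: 4.0` can carry far more than 12 VCPU allocations. A quick arithmetic check against the numbers in the log (values copied verbatim; the calculation itself is just an illustration, not Nova code):

```python
# Effective capacity for the inventory reported above (values from the log).
inventory = {
    "VCPU":      {"total": 12,    "reserved": 0,   "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 16023, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB":   {"total": 40,    "reserved": 0,   "allocation_ratio": 1.0},
}

for rc, inv in inventory.items():
    # Placement-style effective capacity: (total - reserved) * allocation_ratio.
    capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
    print(f"{rc}: {capacity:g} consumable units")

# Expected output:
# VCPU: 48 consumable units
# MEMORY_MB: 15511 consumable units
# DISK_GB: 40 consumable units
```

Those figures are consistent with the earlier "Final resource view" entry (total_vcpus=12, used_vcpus=5, used_ram=1152MB, used_disk=5GB): the host is nowhere near its effective limits, so the inventory update is a no-op.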
Apr 24 00:19:39 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-4d8dddce-5407-4b8b-8af4-1ad04432d6ee tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:19:39 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-4d8dddce-5407-4b8b-8af4-1ad04432d6ee tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:19:39 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-4d8dddce-5407-4b8b-8af4-1ad04432d6ee tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:19:39 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-4d8dddce-5407-4b8b-8af4-1ad04432d6ee tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:19:39 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-4d8dddce-5407-4b8b-8af4-1ad04432d6ee tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.179s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:19:39 user nova-compute[71205]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:19:39 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] VM Stopped (Lifecycle Event) Apr 24 00:19:39 user nova-compute[71205]: DEBUG nova.compute.manager [None req-b57ab8cc-6200-4f2b-9ad5-23e6a0ca9fbb None None] [instance: a2e58fac-ff2d-47e5-866d-de1f2b741cb3] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:19:39 user nova-compute[71205]: INFO nova.scheduler.client.report [None req-4d8dddce-5407-4b8b-8af4-1ad04432d6ee tempest-SnapshotDataIntegrityTests-901788601 tempest-SnapshotDataIntegrityTests-901788601-project-member] Deleted allocations for instance 1728a086-3a73-4157-8c2f-3606820448a9 Apr 24 00:19:39 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-4d8dddce-5407-4b8b-8af4-1ad04432d6ee tempest-SnapshotDataIntegrityTests-901788601 
tempest-SnapshotDataIntegrityTests-901788601-project-member] Lock "1728a086-3a73-4157-8c2f-3606820448a9" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.564s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:19:40 user nova-compute[71205]: DEBUG nova.compute.manager [req-98612458-8a25-4cec-8a6e-b04ff3730ffb req-c769f142-e68c-404d-8c62-c64cc4142422 service nova] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Received event network-vif-plugged-1d88c88b-4db7-4078-b68a-941a4a4e7920 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:19:40 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-98612458-8a25-4cec-8a6e-b04ff3730ffb req-c769f142-e68c-404d-8c62-c64cc4142422 service nova] Acquiring lock "1728a086-3a73-4157-8c2f-3606820448a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:19:40 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-98612458-8a25-4cec-8a6e-b04ff3730ffb req-c769f142-e68c-404d-8c62-c64cc4142422 service nova] Lock "1728a086-3a73-4157-8c2f-3606820448a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:19:40 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-98612458-8a25-4cec-8a6e-b04ff3730ffb req-c769f142-e68c-404d-8c62-c64cc4142422 service nova] Lock "1728a086-3a73-4157-8c2f-3606820448a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:19:40 user nova-compute[71205]: DEBUG nova.compute.manager [req-98612458-8a25-4cec-8a6e-b04ff3730ffb req-c769f142-e68c-404d-8c62-c64cc4142422 service nova] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] No waiting events found dispatching network-vif-plugged-1d88c88b-4db7-4078-b68a-941a4a4e7920 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:19:40 user nova-compute[71205]: WARNING nova.compute.manager [req-98612458-8a25-4cec-8a6e-b04ff3730ffb req-c769f142-e68c-404d-8c62-c64cc4142422 service nova] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Received unexpected event network-vif-plugged-1d88c88b-4db7-4078-b68a-941a4a4e7920 for instance with vm_state deleted and task_state None. 
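Editor's note: the network-vif-plugged/unplugged/deleted messages are Neutron port notifications dispatched through a per-instance event registry; when nothing is waiting for them (the instance here is already deleted) they are logged as unexpected and dropped. A simplified, generic illustration of that register/pop pattern using only the standard library; it is not Nova's actual InstanceEvents implementation:

    # Register/pop sketch behind "No waiting events found dispatching ..." and
    # "Received unexpected event ...". Illustrative only.
    import threading

    class EventRegistry:
        def __init__(self):
            self._lock = threading.Lock()
            self._waiters = {}   # (instance_uuid, event_name) -> threading.Event

        def prepare(self, instance_uuid, event_name):
            """Register interest before an operation that expects a notification."""
            with self._lock:
                waiter = threading.Event()
                self._waiters[(instance_uuid, event_name)] = waiter
                return waiter

        def pop(self, instance_uuid, event_name):
            """Hand the notification to whoever registered for it, if anyone."""
            with self._lock:
                return self._waiters.pop((instance_uuid, event_name), None)

    registry = EventRegistry()

    def external_instance_event(instance_uuid, event_name):
        waiter = registry.pop(instance_uuid, event_name)
        if waiter is None:
            # Nothing was waiting, e.g. the instance was already deleted.
            print('No waiting events found dispatching %s' % event_name)
        else:
            waiter.set()

    external_instance_event('1728a086-3a73-4157-8c2f-3606820448a9',
                            'network-vif-plugged')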
Apr 24 00:19:40 user nova-compute[71205]: DEBUG nova.compute.manager [req-98612458-8a25-4cec-8a6e-b04ff3730ffb req-c769f142-e68c-404d-8c62-c64cc4142422 service nova] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Received event network-vif-deleted-1d88c88b-4db7-4078-b68a-941a4a4e7920 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:19:43 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:48 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:19:53 user nova-compute[71205]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:19:53 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] VM Stopped (Lifecycle Event) Apr 24 00:19:53 user nova-compute[71205]: DEBUG nova.compute.manager [None req-2d0b1ddd-c637-491f-a7c6-e298f11231b3 None None] [instance: 1728a086-3a73-4157-8c2f-3606820448a9] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:19:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:19:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:19:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:19:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:19:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-01d834ca-3430-4702-8d16-6b507cf25145 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquiring lock "c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:19:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-01d834ca-3430-4702-8d16-6b507cf25145 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:19:56 user nova-compute[71205]: DEBUG 
oslo_concurrency.lockutils [None req-01d834ca-3430-4702-8d16-6b507cf25145 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquiring lock "c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:19:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-01d834ca-3430-4702-8d16-6b507cf25145 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:19:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-01d834ca-3430-4702-8d16-6b507cf25145 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:19:56 user nova-compute[71205]: INFO nova.compute.manager [None req-01d834ca-3430-4702-8d16-6b507cf25145 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Terminating instance Apr 24 00:19:56 user nova-compute[71205]: DEBUG nova.compute.manager [None req-01d834ca-3430-4702-8d16-6b507cf25145 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Start destroying the instance on the hypervisor. 
{{(pid=71205) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 24 00:19:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:57 user nova-compute[71205]: DEBUG nova.compute.manager [req-ca663ec4-35f4-44b1-8e64-22146b07ef71 req-587236a1-6d93-4373-bd20-9aa1ea7d8e1b service nova] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Received event network-vif-unplugged-7a3b1d96-2c84-4994-b698-c59fb56c44f8 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:19:57 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-ca663ec4-35f4-44b1-8e64-22146b07ef71 req-587236a1-6d93-4373-bd20-9aa1ea7d8e1b service nova] Acquiring lock "c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:19:57 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-ca663ec4-35f4-44b1-8e64-22146b07ef71 req-587236a1-6d93-4373-bd20-9aa1ea7d8e1b service nova] Lock "c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:19:57 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-ca663ec4-35f4-44b1-8e64-22146b07ef71 req-587236a1-6d93-4373-bd20-9aa1ea7d8e1b service nova] Lock "c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:19:57 user nova-compute[71205]: DEBUG nova.compute.manager [req-ca663ec4-35f4-44b1-8e64-22146b07ef71 req-587236a1-6d93-4373-bd20-9aa1ea7d8e1b service nova] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] No waiting events found dispatching network-vif-unplugged-7a3b1d96-2c84-4994-b698-c59fb56c44f8 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:19:57 user nova-compute[71205]: DEBUG nova.compute.manager [req-ca663ec4-35f4-44b1-8e64-22146b07ef71 req-587236a1-6d93-4373-bd20-9aa1ea7d8e1b service nova] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Received event network-vif-unplugged-7a3b1d96-2c84-4994-b698-c59fb56c44f8 for instance with task_state deleting. {{(pid=71205) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 24 00:19:57 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:57 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Instance destroyed successfully. 
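Editor's note: at the hypervisor level, "Instance destroyed successfully" corresponds to tearing down the libvirt domain. A rough sketch with the libvirt-python bindings, using the instance UUID from the log and only minimal error handling; Nova's libvirt driver wraps this step in many more checks:

    # Rough libvirt-level equivalent of "Instance destroyed successfully".
    import libvirt

    INSTANCE_UUID = 'c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b'   # from the log above

    conn = libvirt.open('qemu:///system')
    try:
        dom = conn.lookupByUUIDString(INSTANCE_UUID)
        if dom.isActive():
            dom.destroy()        # hard power-off of the guest
        dom.undefine()           # drop the persistent domain definition
    finally:
        conn.close()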
Apr 24 00:19:57 user nova-compute[71205]: DEBUG nova.objects.instance [None req-01d834ca-3430-4702-8d16-6b507cf25145 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lazy-loading 'resources' on Instance uuid c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:19:57 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-01d834ca-3430-4702-8d16-6b507cf25145 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:11:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1137356929',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1137356929',id=6,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-24T00:11:42Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='d26ba1ed4b9241f9a084db1a14a945bb',ramdisk_id='',reservation_id='r-jniph3u4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-2021792443',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-24T00:13:24Z,user_data=None,user_id='8d0ab07106dd4995aa7e3f5b6bc70e56',uuid=c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7a3b1d96-2c84-4994-b698-c59fb56c44f8", "address": "fa:16:3e:06:2a:5d", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", 
"details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a3b1d96-2c", "ovs_interfaceid": "7a3b1d96-2c84-4994-b698-c59fb56c44f8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 24 00:19:57 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-01d834ca-3430-4702-8d16-6b507cf25145 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Converting VIF {"id": "7a3b1d96-2c84-4994-b698-c59fb56c44f8", "address": "fa:16:3e:06:2a:5d", "network": {"id": "52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-348197869-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d26ba1ed4b9241f9a084db1a14a945bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a3b1d96-2c", "ovs_interfaceid": "7a3b1d96-2c84-4994-b698-c59fb56c44f8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:19:57 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-01d834ca-3430-4702-8d16-6b507cf25145 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:06:2a:5d,bridge_name='br-int',has_traffic_filtering=True,id=7a3b1d96-2c84-4994-b698-c59fb56c44f8,network=Network(52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a3b1d96-2c') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:19:57 user nova-compute[71205]: DEBUG os_vif [None req-01d834ca-3430-4702-8d16-6b507cf25145 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:06:2a:5d,bridge_name='br-int',has_traffic_filtering=True,id=7a3b1d96-2c84-4994-b698-c59fb56c44f8,network=Network(52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a3b1d96-2c') {{(pid=71205) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 24 00:19:57 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:57 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a3b1d96-2c, bridge=br-int, if_exists=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:19:57 user 
nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:19:57 user nova-compute[71205]: INFO os_vif [None req-01d834ca-3430-4702-8d16-6b507cf25145 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:06:2a:5d,bridge_name='br-int',has_traffic_filtering=True,id=7a3b1d96-2c84-4994-b698-c59fb56c44f8,network=Network(52d3b3b9-3aa8-4e9f-96ed-acbfc8a7b461),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a3b1d96-2c') Apr 24 00:19:57 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-01d834ca-3430-4702-8d16-6b507cf25145 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Deleting instance files /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b_del Apr 24 00:19:57 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-01d834ca-3430-4702-8d16-6b507cf25145 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Deletion of /opt/stack/data/nova/instances/c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b_del complete Apr 24 00:19:57 user nova-compute[71205]: INFO nova.compute.manager [None req-01d834ca-3430-4702-8d16-6b507cf25145 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Took 0.63 seconds to destroy the instance on the hypervisor. Apr 24 00:19:57 user nova-compute[71205]: DEBUG oslo.service.loopingcall [None req-01d834ca-3430-4702-8d16-6b507cf25145 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71205) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 24 00:19:57 user nova-compute[71205]: DEBUG nova.compute.manager [-] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Deallocating network for instance {{(pid=71205) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 24 00:19:57 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] deallocate_for_instance() {{(pid=71205) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 24 00:19:57 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:19:57 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:19:57 user nova-compute[71205]: DEBUG nova.compute.manager [req-fa11a416-bf85-4269-88dd-9d75765131fe req-43607460-7b1d-452b-942a-ba11f6111330 service nova] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Received event network-vif-deleted-7a3b1d96-2c84-4994-b698-c59fb56c44f8 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:19:57 user nova-compute[71205]: INFO nova.compute.manager [req-fa11a416-bf85-4269-88dd-9d75765131fe req-43607460-7b1d-452b-942a-ba11f6111330 service nova] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Neutron deleted interface 7a3b1d96-2c84-4994-b698-c59fb56c44f8; detaching it from the instance and deleting it from the info cache Apr 24 00:19:57 user nova-compute[71205]: DEBUG nova.network.neutron [req-fa11a416-bf85-4269-88dd-9d75765131fe req-43607460-7b1d-452b-942a-ba11f6111330 service nova] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:19:57 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Took 0.53 seconds to deallocate network for instance. Apr 24 00:19:58 user nova-compute[71205]: DEBUG nova.compute.manager [req-fa11a416-bf85-4269-88dd-9d75765131fe req-43607460-7b1d-452b-942a-ba11f6111330 service nova] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Detach interface failed, port_id=7a3b1d96-2c84-4994-b698-c59fb56c44f8, reason: Instance c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b could not be found. 
{{(pid=71205) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 24 00:19:58 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-01d834ca-3430-4702-8d16-6b507cf25145 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:19:58 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-01d834ca-3430-4702-8d16-6b507cf25145 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:19:58 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-01d834ca-3430-4702-8d16-6b507cf25145 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:19:58 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-01d834ca-3430-4702-8d16-6b507cf25145 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:19:58 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-01d834ca-3430-4702-8d16-6b507cf25145 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.155s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:19:58 user nova-compute[71205]: INFO nova.scheduler.client.report [None req-01d834ca-3430-4702-8d16-6b507cf25145 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Deleted allocations for instance c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b Apr 24 00:19:58 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-01d834ca-3430-4702-8d16-6b507cf25145 tempest-ServerBootFromVolumeStableRescueTest-2021792443 tempest-ServerBootFromVolumeStableRescueTest-2021792443-project-member] Lock "c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.479s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:19:59 user nova-compute[71205]: DEBUG 
nova.compute.manager [req-b8b04144-8e63-4db3-baa9-83bda2dbfd56 req-1424ec26-ba7f-4c4a-9b70-6ed0007d02a1 service nova] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Received event network-vif-plugged-7a3b1d96-2c84-4994-b698-c59fb56c44f8 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:19:59 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-b8b04144-8e63-4db3-baa9-83bda2dbfd56 req-1424ec26-ba7f-4c4a-9b70-6ed0007d02a1 service nova] Acquiring lock "c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:19:59 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-b8b04144-8e63-4db3-baa9-83bda2dbfd56 req-1424ec26-ba7f-4c4a-9b70-6ed0007d02a1 service nova] Lock "c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:19:59 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-b8b04144-8e63-4db3-baa9-83bda2dbfd56 req-1424ec26-ba7f-4c4a-9b70-6ed0007d02a1 service nova] Lock "c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:19:59 user nova-compute[71205]: DEBUG nova.compute.manager [req-b8b04144-8e63-4db3-baa9-83bda2dbfd56 req-1424ec26-ba7f-4c4a-9b70-6ed0007d02a1 service nova] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] No waiting events found dispatching network-vif-plugged-7a3b1d96-2c84-4994-b698-c59fb56c44f8 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:19:59 user nova-compute[71205]: WARNING nova.compute.manager [req-b8b04144-8e63-4db3-baa9-83bda2dbfd56 req-1424ec26-ba7f-4c4a-9b70-6ed0007d02a1 service nova] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Received unexpected event network-vif-plugged-7a3b1d96-2c84-4994-b698-c59fb56c44f8 for instance with vm_state deleted and task_state None. 
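Editor's note: every "Acquiring lock ... acquired ... released" triplet in this trace, including the compute_resources and per-instance locks just above, is emitted by oslo.concurrency's lockutils helpers. A minimal sketch of the two usual forms; the guarded bodies are placeholders, not Nova code:

    # oslo.concurrency locking behind the "Acquiring/acquired/released" lines.
    # Lock names are taken from the log; the bodies are placeholders.
    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def update_usage_sketch():
        # Runs with the process-local "compute_resources" semaphore held;
        # lockutils logs the waited/held times seen in the trace.
        pass

    def do_terminate_instance_sketch(instance_uuid):
        # Per-instance serialization: the lock name is just the instance UUID.
        with lockutils.lock(instance_uuid):
            pass

    update_usage_sketch()
    do_terminate_instance_sketch('ce19423d-a6ee-4506-9cd1-ec4803abdd86')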
Apr 24 00:20:00 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-a002df15-d73b-474f-9d1b-46300ecc957a tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Acquiring lock "ce19423d-a6ee-4506-9cd1-ec4803abdd86" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:20:00 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-a002df15-d73b-474f-9d1b-46300ecc957a tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "ce19423d-a6ee-4506-9cd1-ec4803abdd86" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:20:00 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-a002df15-d73b-474f-9d1b-46300ecc957a tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Acquiring lock "ce19423d-a6ee-4506-9cd1-ec4803abdd86-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:20:00 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-a002df15-d73b-474f-9d1b-46300ecc957a tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "ce19423d-a6ee-4506-9cd1-ec4803abdd86-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:20:00 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-a002df15-d73b-474f-9d1b-46300ecc957a tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "ce19423d-a6ee-4506-9cd1-ec4803abdd86-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:20:00 user nova-compute[71205]: INFO nova.compute.manager [None req-a002df15-d73b-474f-9d1b-46300ecc957a tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Terminating instance Apr 24 00:20:00 user nova-compute[71205]: DEBUG nova.compute.manager [None req-a002df15-d73b-474f-9d1b-46300ecc957a tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Start destroying the instance on the hypervisor. 
{{(pid=71205) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 24 00:20:00 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:00 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:00 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:00 user nova-compute[71205]: DEBUG nova.compute.manager [req-1f80b470-982c-4bf0-9f21-14381c1aa53c req-b8ecb232-aebf-473e-883d-dc3d0c2fd159 service nova] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Received event network-vif-unplugged-f6c98734-17ee-42d0-9372-fd76526b0b27 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:20:00 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-1f80b470-982c-4bf0-9f21-14381c1aa53c req-b8ecb232-aebf-473e-883d-dc3d0c2fd159 service nova] Acquiring lock "ce19423d-a6ee-4506-9cd1-ec4803abdd86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:20:00 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-1f80b470-982c-4bf0-9f21-14381c1aa53c req-b8ecb232-aebf-473e-883d-dc3d0c2fd159 service nova] Lock "ce19423d-a6ee-4506-9cd1-ec4803abdd86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:20:00 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-1f80b470-982c-4bf0-9f21-14381c1aa53c req-b8ecb232-aebf-473e-883d-dc3d0c2fd159 service nova] Lock "ce19423d-a6ee-4506-9cd1-ec4803abdd86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:20:00 user nova-compute[71205]: DEBUG nova.compute.manager [req-1f80b470-982c-4bf0-9f21-14381c1aa53c req-b8ecb232-aebf-473e-883d-dc3d0c2fd159 service nova] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] No waiting events found dispatching network-vif-unplugged-f6c98734-17ee-42d0-9372-fd76526b0b27 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:20:00 user nova-compute[71205]: DEBUG nova.compute.manager [req-1f80b470-982c-4bf0-9f21-14381c1aa53c req-b8ecb232-aebf-473e-883d-dc3d0c2fd159 service nova] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Received event network-vif-unplugged-f6c98734-17ee-42d0-9372-fd76526b0b27 for instance with task_state deleting. {{(pid=71205) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 24 00:20:01 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Instance destroyed successfully. 
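Editor's note: the "Deleting instance files /opt/stack/data/nova/instances/<uuid>_del" and "Deletion ... complete" pairs (seen above for c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b and repeated below for this instance) suggest a rename-then-remove pattern: the instance directory is renamed with a _del suffix and then removed recursively. A small sketch of that pattern, assuming a plain local-filesystem instance store and ignoring the retries and race handling a real driver needs:

    # Rename-then-remove sketch matching the "<uuid>_del" directories in the log.
    import os
    import shutil

    def delete_instance_files(instances_path, instance_uuid):
        target = os.path.join(instances_path, instance_uuid)
        target_del = target + '_del'
        if os.path.exists(target):
            os.rename(target, target_del)   # free the original name immediately
        if os.path.exists(target_del):
            shutil.rmtree(target_del)       # "Deletion of ..._del complete"

    delete_instance_files('/opt/stack/data/nova/instances',
                          'ce19423d-a6ee-4506-9cd1-ec4803abdd86')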
Apr 24 00:20:01 user nova-compute[71205]: DEBUG nova.objects.instance [None req-a002df15-d73b-474f-9d1b-46300ecc957a tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lazy-loading 'resources' on Instance uuid ce19423d-a6ee-4506-9cd1-ec4803abdd86 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:20:01 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-a002df15-d73b-474f-9d1b-46300ecc957a tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:11:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-109140451',display_name='tempest-ServersNegativeTestJSON-server-109140451',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-109140451',id=4,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-24T00:11:37Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='ce75f63fc0904eceb03e8319bddba4d3',ramdisk_id='',reservation_id='r-e7rl04pc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersNegativeTestJSON-380105770',owner_user_name='tempest-ServersNegativeTestJSON-380105770-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-24T00:11:38Z,user_data=None,user_id='abae98323deb44dea0622186485cc7af',uuid=ce19423d-a6ee-4506-9cd1-ec4803abdd86,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f6c98734-17ee-42d0-9372-fd76526b0b27", "address": "fa:16:3e:72:c7:21", "network": {"id": "2b9edd97-2af0-4b09-993c-95cc2b427fd8", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1649565193-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ce75f63fc0904eceb03e8319bddba4d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": 
"ovn"}}, "devname": "tapf6c98734-17", "ovs_interfaceid": "f6c98734-17ee-42d0-9372-fd76526b0b27", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 24 00:20:01 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-a002df15-d73b-474f-9d1b-46300ecc957a tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Converting VIF {"id": "f6c98734-17ee-42d0-9372-fd76526b0b27", "address": "fa:16:3e:72:c7:21", "network": {"id": "2b9edd97-2af0-4b09-993c-95cc2b427fd8", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1649565193-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ce75f63fc0904eceb03e8319bddba4d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6c98734-17", "ovs_interfaceid": "f6c98734-17ee-42d0-9372-fd76526b0b27", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:20:01 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-a002df15-d73b-474f-9d1b-46300ecc957a tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:72:c7:21,bridge_name='br-int',has_traffic_filtering=True,id=f6c98734-17ee-42d0-9372-fd76526b0b27,network=Network(2b9edd97-2af0-4b09-993c-95cc2b427fd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6c98734-17') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:20:01 user nova-compute[71205]: DEBUG os_vif [None req-a002df15-d73b-474f-9d1b-46300ecc957a tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:72:c7:21,bridge_name='br-int',has_traffic_filtering=True,id=f6c98734-17ee-42d0-9372-fd76526b0b27,network=Network(2b9edd97-2af0-4b09-993c-95cc2b427fd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6c98734-17') {{(pid=71205) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 24 00:20:01 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:01 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6c98734-17, bridge=br-int, if_exists=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:20:01 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 
00:20:01 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:20:01 user nova-compute[71205]: INFO os_vif [None req-a002df15-d73b-474f-9d1b-46300ecc957a tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:72:c7:21,bridge_name='br-int',has_traffic_filtering=True,id=f6c98734-17ee-42d0-9372-fd76526b0b27,network=Network(2b9edd97-2af0-4b09-993c-95cc2b427fd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6c98734-17') Apr 24 00:20:01 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-a002df15-d73b-474f-9d1b-46300ecc957a tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Deleting instance files /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86_del Apr 24 00:20:01 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-a002df15-d73b-474f-9d1b-46300ecc957a tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Deletion of /opt/stack/data/nova/instances/ce19423d-a6ee-4506-9cd1-ec4803abdd86_del complete Apr 24 00:20:01 user nova-compute[71205]: INFO nova.compute.manager [None req-a002df15-d73b-474f-9d1b-46300ecc957a tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Took 0.65 seconds to destroy the instance on the hypervisor. Apr 24 00:20:01 user nova-compute[71205]: DEBUG oslo.service.loopingcall [None req-a002df15-d73b-474f-9d1b-46300ecc957a tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71205) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 24 00:20:01 user nova-compute[71205]: DEBUG nova.compute.manager [-] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Deallocating network for instance {{(pid=71205) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 24 00:20:01 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] deallocate_for_instance() {{(pid=71205) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 24 00:20:02 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:20:02 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Took 0.89 seconds to deallocate network for instance. 
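Editor's note: each DelPortCommand transaction above removes the instance's tap device from br-int; the port name (devname) and bridge come straight from the VIF's network_info shown a few lines earlier. A hedged, CLI-based stand-in for that step, assuming ovs-vsctl and sudo are available; os-vif actually drives the native ovsdbapp interface shown in the log, and real deployments use rootwrap or privsep rather than plain sudo:

    # Pull devname/bridge out of a network_info-style VIF dict (values copied
    # from the log) and remove the OVS port with ovs-vsctl.
    from oslo_concurrency import processutils

    vif = {
        "id": "f6c98734-17ee-42d0-9372-fd76526b0b27",
        "devname": "tapf6c98734-17",
        "network": {"bridge": "br-int"},
    }

    def delete_ovs_vif_port(vif):
        bridge = vif["network"]["bridge"]
        devname = vif["devname"]
        # --if-exists mirrors DelPortCommand(if_exists=True): no error if the
        # port is already gone.
        processutils.execute('ovs-vsctl', '--if-exists', 'del-port',
                             bridge, devname,
                             run_as_root=True, root_helper='sudo')

    delete_ovs_vif_port(vif)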
Apr 24 00:20:02 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-a002df15-d73b-474f-9d1b-46300ecc957a tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:20:02 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-a002df15-d73b-474f-9d1b-46300ecc957a tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:20:02 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-a002df15-d73b-474f-9d1b-46300ecc957a tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:20:02 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-a002df15-d73b-474f-9d1b-46300ecc957a tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:20:02 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-a002df15-d73b-474f-9d1b-46300ecc957a tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.147s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:20:02 user nova-compute[71205]: INFO nova.scheduler.client.report [None req-a002df15-d73b-474f-9d1b-46300ecc957a tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Deleted allocations for instance ce19423d-a6ee-4506-9cd1-ec4803abdd86 Apr 24 00:20:02 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-a002df15-d73b-474f-9d1b-46300ecc957a tempest-ServersNegativeTestJSON-380105770 tempest-ServersNegativeTestJSON-380105770-project-member] Lock "ce19423d-a6ee-4506-9cd1-ec4803abdd86" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.024s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:20:02 user nova-compute[71205]: DEBUG nova.compute.manager [req-695fec98-a39d-41d3-a9ae-6bb921938f28 req-8e7ce0bc-9deb-4ca7-be05-db92dd656b2a service nova] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Received event network-vif-plugged-f6c98734-17ee-42d0-9372-fd76526b0b27 {{(pid=71205) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:20:02 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-695fec98-a39d-41d3-a9ae-6bb921938f28 req-8e7ce0bc-9deb-4ca7-be05-db92dd656b2a service nova] Acquiring lock "ce19423d-a6ee-4506-9cd1-ec4803abdd86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:20:02 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-695fec98-a39d-41d3-a9ae-6bb921938f28 req-8e7ce0bc-9deb-4ca7-be05-db92dd656b2a service nova] Lock "ce19423d-a6ee-4506-9cd1-ec4803abdd86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:20:02 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-695fec98-a39d-41d3-a9ae-6bb921938f28 req-8e7ce0bc-9deb-4ca7-be05-db92dd656b2a service nova] Lock "ce19423d-a6ee-4506-9cd1-ec4803abdd86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:20:02 user nova-compute[71205]: DEBUG nova.compute.manager [req-695fec98-a39d-41d3-a9ae-6bb921938f28 req-8e7ce0bc-9deb-4ca7-be05-db92dd656b2a service nova] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] No waiting events found dispatching network-vif-plugged-f6c98734-17ee-42d0-9372-fd76526b0b27 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:20:02 user nova-compute[71205]: WARNING nova.compute.manager [req-695fec98-a39d-41d3-a9ae-6bb921938f28 req-8e7ce0bc-9deb-4ca7-be05-db92dd656b2a service nova] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Received unexpected event network-vif-plugged-f6c98734-17ee-42d0-9372-fd76526b0b27 for instance with vm_state deleted and task_state None. 
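Editor's note: the inventory dictionaries reported to placement above translate into schedulable capacity as (total - reserved) * allocation_ratio per resource class, i.e. 48 VCPU, 15511 MB of RAM and 40 GB of disk for this node. A quick check of that arithmetic:

    # Effective capacity implied by the inventory dict in the log:
    # capacity = (total - reserved) * allocation_ratio
    inventory = {
        'VCPU':      {'total': 12,    'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 40,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, capacity)
    # VCPU 48.0, MEMORY_MB 15511.0, DISK_GB 40.0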
Apr 24 00:20:02 user nova-compute[71205]: DEBUG nova.compute.manager [req-695fec98-a39d-41d3-a9ae-6bb921938f28 req-8e7ce0bc-9deb-4ca7-be05-db92dd656b2a service nova] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Received event network-vif-deleted-f6c98734-17ee-42d0-9372-fd76526b0b27 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:20:06 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:11 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:20:12 user nova-compute[71205]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:20:12 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] VM Stopped (Lifecycle Event) Apr 24 00:20:12 user nova-compute[71205]: DEBUG nova.compute.manager [None req-9807ac8c-23c9-4bff-ab65-4719de1ee2c0 None None] [instance: c4ed916c-8dc9-4fbd-a18f-ca7428e80f5b] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:20:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:15 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:20:16 user nova-compute[71205]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:20:16 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] VM Stopped (Lifecycle Event) Apr 24 00:20:16 user nova-compute[71205]: DEBUG nova.compute.manager [None req-0cbc417b-de32-4e11-b9cb-733d28152972 None None] [instance: ce19423d-a6ee-4506-9cd1-ec4803abdd86] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:20:16 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:17 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:20:17 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:20:17 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71205) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 24 00:20:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquiring lock "88bc0884-174c-402c-9cbd-7099895f27a0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:20:17 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "88bc0884-174c-402c-9cbd-7099895f27a0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:20:17 user nova-compute[71205]: DEBUG nova.compute.manager [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Starting instance... {{(pid=71205) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 24 00:20:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:20:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:20:18 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71205) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 24 00:20:18 user nova-compute[71205]: INFO nova.compute.claims [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Claim successful on node user Apr 24 00:20:18 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:20:18 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:20:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.213s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:20:18 user nova-compute[71205]: DEBUG nova.compute.manager [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Start building networks asynchronously for instance. 
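The claim and report-client records above carry the provider's inventory (VCPU, MEMORY_MB, DISK_GB, each with a reserved amount and an allocation_ratio). As a worked check of why a 1 vCPU / 128 MB / 1 GB claim fits, the sketch below applies the usual placement capacity rule, (total - reserved) * allocation_ratio, to exactly those numbers; it is a standalone calculation, not the resource tracker's code.

    # Inventory exactly as logged for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4.
    inventory = {
        'VCPU': {'total': 12, 'reserved': 0, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 40, 'reserved': 0, 'allocation_ratio': 1.0},
    }

    # What the m1.nano instance being claimed above needs.
    request = {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}


    def capacity(inv):
        """Effective capacity: (total - reserved) * allocation_ratio."""
        return (inv['total'] - inv['reserved']) * inv['allocation_ratio']


    for rc, wanted in request.items():
        cap = capacity(inventory[rc])
        print('%s: need %s, capacity %s -> %s'
              % (rc, wanted, cap, 'fits' if wanted <= cap else 'does not fit'))
    # VCPU capacity 48.0, MEMORY_MB 15511.0, DISK_GB 40.0: the claim succeeds.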
{{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 24 00:20:18 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:20:18 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:20:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:20:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:20:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:20:18 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:20:18 user nova-compute[71205]: DEBUG nova.compute.manager [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Allocating IP information in the background. {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 24 00:20:18 user nova-compute[71205]: DEBUG nova.network.neutron [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] allocate_for_instance() {{(pid=71205) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 24 00:20:18 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 24 00:20:18 user nova-compute[71205]: DEBUG nova.compute.manager [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Start building block device mappings for instance. 
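"Start building networks asynchronously" here, and "Allocating IP information in the background" just below, reflect port allocation running concurrently with block-device preparation and only being joined before the guest is spawned. Nova does this with eventlet greenthreads; the sketch below shows the same overlap with the stdlib's concurrent.futures as a rough analogue, and both worker functions are invented placeholders.

    from concurrent.futures import ThreadPoolExecutor
    import time

    INSTANCE = '88bc0884-174c-402c-9cbd-7099895f27a0'


    def allocate_network(instance_uuid):
        # Stand-in for allocate_for_instance(): create the port, wait for
        # neutron, and return a network_info-style list.
        time.sleep(0.2)
        return [{'id': '904cb737-...', 'address': 'fa:16:3e:d1:cc:51'}]


    def build_block_devices(instance_uuid):
        # Stand-in for block-device-mapping preparation, which proceeds
        # while the port allocation is still in flight.
        time.sleep(0.1)
        return ['/dev/vda']


    with ThreadPoolExecutor(max_workers=1) as pool:
        nw_future = pool.submit(allocate_network, INSTANCE)
        bdms = build_block_devices(INSTANCE)
        network_info = nw_future.result()  # join before spawning the guest
        print(bdms, network_info)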
{{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 24 00:20:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:20:18 user nova-compute[71205]: DEBUG nova.policy [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '35edcadbe77c4f4fa8304216e7f61d4a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '97d1e8a757a746329ea363af81a3c6b4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71205) authorize /opt/stack/nova/nova/policy.py:203}} Apr 24 00:20:18 user nova-compute[71205]: DEBUG nova.compute.manager [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Start spawning the instance on the hypervisor. {{(pid=71205) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 24 00:20:18 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Creating instance directory {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 24 00:20:18 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Creating image(s) Apr 24 00:20:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquiring lock "/opt/stack/data/nova/instances/88bc0884-174c-402c-9cbd-7099895f27a0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:20:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "/opt/stack/data/nova/instances/88bc0884-174c-402c-9cbd-7099895f27a0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:20:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 
tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "/opt/stack/data/nova/instances/88bc0884-174c-402c-9cbd-7099895f27a0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:20:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:20:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json" returned: 0 in 0.161s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:20:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:20:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.133s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:20:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquiring lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:20:18 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:20:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 
tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:20:18 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.425s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/88bc0884-174c-402c-9cbd-7099895f27a0/disk 1073741824 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:20:19 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:20:19 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
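The processutils records in this stretch probe images with qemu-img info under the oslo_concurrency.prlimit wrapper (1 GiB address-space cap, 30 s CPU cap) and then create the instance's qcow2 overlay on top of the cached base image. The sketch below re-issues the same two commands with subprocess and parses the JSON output; it assumes a devstack host where qemu-img, oslo.concurrency, and these exact paths exist, so treat it as illustrative only.

    import json
    import subprocess
    import sys

    BASE = '/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8'
    DISK = '/opt/stack/data/nova/instances/88bc0884-174c-402c-9cbd-7099895f27a0/disk'

    # qemu-img info, wrapped in oslo_concurrency.prlimit exactly as logged
    # (1 GiB address-space cap and 30 s CPU cap on the child process).
    info_cmd = [
        sys.executable, '-m', 'oslo_concurrency.prlimit',
        '--as=1073741824', '--cpu=30', '--',
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'info', BASE, '--force-share', '--output=json',
    ]
    info = json.loads(subprocess.check_output(info_cmd))
    print(info['format'], info['virtual-size'])

    # The 1 GiB qcow2 overlay whose backing file is the raw base image,
    # matching the qemu-img create record above.
    create_cmd = [
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'create', '-f', 'qcow2',
        '-o', 'backing_file=%s,backing_fmt=raw' % BASE,
        DISK, '1073741824',
    ]
    subprocess.check_call(create_cmd)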
Apr 24 00:20:19 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Hypervisor/Node resource view: name=user free_ram=8930MB free_disk=26.662368774414062GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/88bc0884-174c-402c-9cbd-7099895f27a0/disk 1073741824" returned: 0 in 0.054s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.484s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG nova.network.neutron [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Successfully created port: 904cb737-715a-45a2-8c4e-e99e90596e6e {{(pid=71205) 
_create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance cf2c88d5-8347-4166-a037-158f29c32d1a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance 88bc0884-174c-402c-9cbd-7099895f27a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.133s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Checking if we can resize image /opt/stack/data/nova/instances/88bc0884-174c-402c-9cbd-7099895f27a0/disk. 
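The "Final resource view" record above (used_ram=768MB, used_disk=2GB, used_vcpus=2 against phys_ram=16023MB, phys_disk=40GB, total_vcpus=12) is simply the 512 MB host reservation plus the two tracked 128 MB / 1 vCPU / 1 GB instances. A quick standalone check of that arithmetic:

    reserved_host_memory_mb = 512   # the MEMORY_MB 'reserved' value in the inventory
    instances = [  # the two instances listed as actively managed above
        {'uuid': 'cf2c88d5-8347-4166-a037-158f29c32d1a', 'vcpus': 1, 'memory_mb': 128, 'root_gb': 1},
        {'uuid': '88bc0884-174c-402c-9cbd-7099895f27a0', 'vcpus': 1, 'memory_mb': 128, 'root_gb': 1},
    ]

    used_vcpus = sum(i['vcpus'] for i in instances)
    used_ram_mb = reserved_host_memory_mb + sum(i['memory_mb'] for i in instances)
    used_disk_gb = sum(i['root_gb'] for i in instances)

    # Matches "used_ram=768MB used_disk=2GB ... used_vcpus=2" in the record above.
    print(used_vcpus, used_ram_mb, used_disk_gb)  # -> 2 768 2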
size=1073741824 {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/88bc0884-174c-402c-9cbd-7099895f27a0/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/88bc0884-174c-402c-9cbd-7099895f27a0/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Cannot resize image /opt/stack/data/nova/instances/88bc0884-174c-402c-9cbd-7099895f27a0/disk to a smaller size. 
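The pair of records ending here asks qemu-img for the overlay's virtual size and then logs "Cannot resize image ... to a smaller size": the flavor's 1 GiB root disk is not larger than the freshly created 1 GiB overlay, so the grow step is skipped. A sketch of that guard follows; the helper name and exact comparison are illustrative, with only the qemu-img invocation and the 1073741824-byte figure taken from the log.

    import json
    import subprocess


    def can_resize_image(path, requested_bytes):
        """Illustrative guard: only ever grow an image, never shrink it."""
        out = subprocess.check_output([
            'env', 'LC_ALL=C', 'LANG=C',
            'qemu-img', 'info', path, '--force-share', '--output=json',
        ])
        virtual_size = json.loads(out)['virtual-size']
        if requested_bytes <= virtual_size:
            print('Cannot resize image %s to a smaller size.' % path)
            return False
        return True


    # flavor root_gb=1 -> 1073741824 bytes, the same figure as in the log;
    # the overlay was just created at that size, so the check declines.
    can_resize_image(
        '/opt/stack/data/nova/instances/88bc0884-174c-402c-9cbd-7099895f27a0/disk',
        1 * 1024 ** 3)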
{{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG nova.objects.instance [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lazy-loading 'migration_context' on Instance uuid 88bc0884-174c-402c-9cbd-7099895f27a0 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Created local disks {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Ensure instance console log exists: /opt/stack/data/nova/instances/88bc0884-174c-402c-9cbd-7099895f27a0/console.log {{(pid=71205) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.294s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG nova.network.neutron [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Successfully updated port: 904cb737-715a-45a2-8c4e-e99e90596e6e 
{{(pid=71205) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquiring lock "refresh_cache-88bc0884-174c-402c-9cbd-7099895f27a0" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquired lock "refresh_cache-88bc0884-174c-402c-9cbd-7099895f27a0" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:20:19 user nova-compute[71205]: DEBUG nova.network.neutron [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Building network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.compute.manager [req-f32f244e-c11b-4462-b004-f3965f86ad0a req-2b5f78ad-181a-4999-a54d-55ed02506d7c service nova] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Received event network-changed-904cb737-715a-45a2-8c4e-e99e90596e6e {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.compute.manager [req-f32f244e-c11b-4462-b004-f3965f86ad0a req-2b5f78ad-181a-4999-a54d-55ed02506d7c service nova] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Refreshing instance network info cache due to event network-changed-904cb737-715a-45a2-8c4e-e99e90596e6e. {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-f32f244e-c11b-4462-b004-f3965f86ad0a req-2b5f78ad-181a-4999-a54d-55ed02506d7c service nova] Acquiring lock "refresh_cache-88bc0884-174c-402c-9cbd-7099895f27a0" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.network.neutron [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Instance cache missing network info. 
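The cache update logged just below carries the full network_info entry for port 904cb737-715a-45a2-8c4e-e99e90596e6e (MAC, fixed IP 10.0.0.8, MTU 1442, bridge br-int, devname tap904cb737-71). The sketch below pulls the commonly needed fields out of such a structure; the dict literal is abridged from the logged JSON.

    vif = {
        'id': '904cb737-715a-45a2-8c4e-e99e90596e6e',
        'address': 'fa:16:3e:d1:cc:51',
        'type': 'ovs',
        'devname': 'tap904cb737-71',
        'details': {'port_filter': True, 'connectivity': 'l2', 'bound_drivers': {'0': 'ovn'}},
        'network': {
            'id': '1a83eb44-caaf-46ff-8823-be4580052b3f',
            'bridge': 'br-int',
            'meta': {'mtu': 1442, 'tunneled': True},
            'subnets': [{
                'cidr': '10.0.0.0/28',
                'gateway': {'address': '10.0.0.1', 'type': 'gateway', 'version': 4},
                'ips': [{'address': '10.0.0.8', 'type': 'fixed', 'version': 4, 'floating_ips': []}],
            }],
        },
    }

    fixed_ips = [ip['address']
                 for subnet in vif['network']['subnets']
                 for ip in subnet['ips'] if ip['type'] == 'fixed']
    print(vif['address'], vif['devname'], vif['network']['bridge'],
          vif['network']['meta']['mtu'], fixed_ips)   # -> ... 1442 ['10.0.0.8']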
{{(pid=71205) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.network.neutron [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Updating instance_info_cache with network_info: [{"id": "904cb737-715a-45a2-8c4e-e99e90596e6e", "address": "fa:16:3e:d1:cc:51", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap904cb737-71", "ovs_interfaceid": "904cb737-715a-45a2-8c4e-e99e90596e6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Releasing lock "refresh_cache-88bc0884-174c-402c-9cbd-7099895f27a0" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.compute.manager [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Instance network_info: |[{"id": "904cb737-715a-45a2-8c4e-e99e90596e6e", "address": "fa:16:3e:d1:cc:51", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap904cb737-71", "ovs_interfaceid": "904cb737-715a-45a2-8c4e-e99e90596e6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-f32f244e-c11b-4462-b004-f3965f86ad0a req-2b5f78ad-181a-4999-a54d-55ed02506d7c service nova] Acquired lock "refresh_cache-88bc0884-174c-402c-9cbd-7099895f27a0" {{(pid=71205) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.network.neutron [req-f32f244e-c11b-4462-b004-f3965f86ad0a req-2b5f78ad-181a-4999-a54d-55ed02506d7c service nova] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Refreshing network info cache for port 904cb737-715a-45a2-8c4e-e99e90596e6e {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Start _get_guest_xml network_info=[{"id": "904cb737-715a-45a2-8c4e-e99e90596e6e", "address": "fa:16:3e:d1:cc:51", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap904cb737-71", "ovs_interfaceid": "904cb737-715a-45a2-8c4e-e99e90596e6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': 'fcf09ead-c5af-40cc-b5cf-92626e181ef9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 24 00:20:20 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:20:20 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71205) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-24T00:08:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=), allow threads: True {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Flavor limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Image limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Flavor pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Image pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) 
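The nova.virt.hardware records around here choose a guest CPU topology: with flavor and image limits and preferences all 0:0:0, the candidates for a 1 vCPU guest collapse to sockets=1, cores=1, threads=1 under the 65536 caps, as the enumeration just below confirms. The sketch is a simplified analogue of that selection (exact factorizations of the vCPU count under per-dimension caps), not nova's implementation.

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
        """All (sockets, cores, threads) triples whose product equals vcpus."""
        topologies = []
        for sockets in range(1, min(vcpus, max_sockets) + 1):
            for cores in range(1, min(vcpus, max_cores) + 1):
                for threads in range(1, min(vcpus, max_threads) + 1):
                    if sockets * cores * threads == vcpus:
                        topologies.append((sockets, cores, threads))
        return topologies


    # For the 1-vCPU m1.nano flavor this yields exactly one candidate,
    # matching "Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)]".
    print(possible_topologies(1))    # -> [(1, 1, 1)]
    print(possible_topologies(4))    # e.g. (1, 4, 1), (2, 2, 1), (4, 1, 1), ...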
{{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Got 1 possible topologies {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:20:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-806357688',display_name='tempest-AttachVolumeNegativeTest-server-806357688',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-806357688',id=22,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIS7J2r3hvdb0rW9oq624HJm6iEzqAHNFXYccC2ziX4/UQM6xe9IVnyYzcYeVB3AHvem3MrTEL+AAJWg7X9ws6qIEWDFHnxd+tlDDo0bvhx6gq7aYfsNiTDpRmFhx0Gb2g==',key_name='tempest-keypair-1028661157',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='97d1e8a757a746329ea363af81a3c6b4',ramdisk_id='',reservation_id='r-khqn9ba6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-272859998',owner_user_name='tempest-AttachVolumeNegativeTest-272859998-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:20:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35edcadbe77c4f4fa8304216e7f61d4a',uuid=88bc0884-174c-402c-9cbd-7099895f27a0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "904cb737-715a-45a2-8c4e-e99e90596e6e", "address": "fa:16:3e:d1:cc:51", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap904cb737-71", "ovs_interfaceid": "904cb737-715a-45a2-8c4e-e99e90596e6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71205) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Converting VIF {"id": "904cb737-715a-45a2-8c4e-e99e90596e6e", "address": "fa:16:3e:d1:cc:51", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap904cb737-71", "ovs_interfaceid": "904cb737-715a-45a2-8c4e-e99e90596e6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}}
Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:cc:51,bridge_name='br-int',has_traffic_filtering=True,id=904cb737-715a-45a2-8c4e-e99e90596e6e,network=Network(1a83eb44-caaf-46ff-8823-be4580052b3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap904cb737-71') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}}
Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.objects.instance [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lazy-loading 'pci_devices' on Instance uuid 88bc0884-174c-402c-9cbd-7099895f27a0 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}}
Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] End _get_guest_xml xml= [libvirt guest XML body stripped of markup in this capture; recoverable values: uuid 88bc0884-174c-402c-9cbd-7099895f27a0, name instance-00000016, memory 131072 KiB, 1 vCPU, nova metadata name tempest-AttachVolumeNegativeTest-server-806357688 created 2023-04-24 00:20:20 (flavor 128 MB / 1 vCPU / 1 GB root / 0 ephemeral / 0 swap), owner tempest-AttachVolumeNegativeTest-272859998-project-member, project tempest-AttachVolumeNegativeTest-272859998, sysinfo OpenStack Foundation / OpenStack Nova 0.0.0, Virtual Machine, type hvm, CPU model Nehalem, RNG backend /dev/urandom] {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}}
Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:20:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-806357688',display_name='tempest-AttachVolumeNegativeTest-server-806357688',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-806357688',id=22,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIS7J2r3hvdb0rW9oq624HJm6iEzqAHNFXYccC2ziX4/UQM6xe9IVnyYzcYeVB3AHvem3MrTEL+AAJWg7X9ws6qIEWDFHnxd+tlDDo0bvhx6gq7aYfsNiTDpRmFhx0Gb2g==',key_name='tempest-keypair-1028661157',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='97d1e8a757a746329ea363af81a3c6b4',ramdisk_id='',reservation_id='r-khqn9ba6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-272859998',owner_user_name='tempest-AttachVolumeNegativeTest-272859998-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:20:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35edcadbe77c4f4fa8304216e7f61d4a',uuid=88bc0884-174c-402c-9cbd-7099895f27a0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "904cb737-715a-45a2-8c4e-e99e90596e6e", "address": "fa:16:3e:d1:cc:51", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap904cb737-71", "ovs_interfaceid": "904cb737-715a-45a2-8c4e-e99e90596e6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Converting VIF {"id": "904cb737-715a-45a2-8c4e-e99e90596e6e", "address": "fa:16:3e:d1:cc:51", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": 
false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap904cb737-71", "ovs_interfaceid": "904cb737-715a-45a2-8c4e-e99e90596e6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:cc:51,bridge_name='br-int',has_traffic_filtering=True,id=904cb737-715a-45a2-8c4e-e99e90596e6e,network=Network(1a83eb44-caaf-46ff-8823-be4580052b3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap904cb737-71') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG os_vif [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:cc:51,bridge_name='br-int',has_traffic_filtering=True,id=904cb737-715a-45a2-8c4e-e99e90596e6e,network=Network(1a83eb44-caaf-46ff-8823-be4580052b3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap904cb737-71') {{(pid=71205) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap904cb737-71, may_exist=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap904cb737-71, col_values=(('external_ids', {'iface-id': 
'904cb737-715a-45a2-8c4e-e99e90596e6e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d1:cc:51', 'vm-uuid': '88bc0884-174c-402c-9cbd-7099895f27a0'}),)) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Starting heal instance info cache {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:20 user nova-compute[71205]: INFO os_vif [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:cc:51,bridge_name='br-int',has_traffic_filtering=True,id=904cb737-715a-45a2-8c4e-e99e90596e6e,network=Network(1a83eb44-caaf-46ff-8823-be4580052b3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap904cb737-71') Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Didn't find any instances for network info cache update. {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71205) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] No VIF found with MAC fa:16:3e:d1:cc:51, not building metadata {{(pid=71205) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.network.neutron [req-f32f244e-c11b-4462-b004-f3965f86ad0a req-2b5f78ad-181a-4999-a54d-55ed02506d7c service nova] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Updated VIF entry in instance network info cache for port 904cb737-715a-45a2-8c4e-e99e90596e6e. {{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG nova.network.neutron [req-f32f244e-c11b-4462-b004-f3965f86ad0a req-2b5f78ad-181a-4999-a54d-55ed02506d7c service nova] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Updating instance_info_cache with network_info: [{"id": "904cb737-715a-45a2-8c4e-e99e90596e6e", "address": "fa:16:3e:d1:cc:51", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap904cb737-71", "ovs_interfaceid": "904cb737-715a-45a2-8c4e-e99e90596e6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:20:20 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-f32f244e-c11b-4462-b004-f3965f86ad0a req-2b5f78ad-181a-4999-a54d-55ed02506d7c service nova] Releasing lock "refresh_cache-88bc0884-174c-402c-9cbd-7099895f27a0" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:20:21 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:21 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:21 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:22 user nova-compute[71205]: DEBUG nova.compute.manager [req-71a3e93f-1a4b-400d-bb85-a45db2ec1e34 req-2afde192-201f-4ada-b038-ed8b6487af8e service nova] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Received event network-vif-plugged-904cb737-715a-45a2-8c4e-e99e90596e6e {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:20:22 user nova-compute[71205]: DEBUG 
oslo_concurrency.lockutils [req-71a3e93f-1a4b-400d-bb85-a45db2ec1e34 req-2afde192-201f-4ada-b038-ed8b6487af8e service nova] Acquiring lock "88bc0884-174c-402c-9cbd-7099895f27a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:20:22 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-71a3e93f-1a4b-400d-bb85-a45db2ec1e34 req-2afde192-201f-4ada-b038-ed8b6487af8e service nova] Lock "88bc0884-174c-402c-9cbd-7099895f27a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:20:22 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-71a3e93f-1a4b-400d-bb85-a45db2ec1e34 req-2afde192-201f-4ada-b038-ed8b6487af8e service nova] Lock "88bc0884-174c-402c-9cbd-7099895f27a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:20:22 user nova-compute[71205]: DEBUG nova.compute.manager [req-71a3e93f-1a4b-400d-bb85-a45db2ec1e34 req-2afde192-201f-4ada-b038-ed8b6487af8e service nova] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] No waiting events found dispatching network-vif-plugged-904cb737-715a-45a2-8c4e-e99e90596e6e {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:20:22 user nova-compute[71205]: WARNING nova.compute.manager [req-71a3e93f-1a4b-400d-bb85-a45db2ec1e34 req-2afde192-201f-4ada-b038-ed8b6487af8e service nova] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Received unexpected event network-vif-plugged-904cb737-715a-45a2-8c4e-e99e90596e6e for instance with vm_state building and task_state spawning. 
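The lock/pop/warn sequence above is nova-compute dispatching the externally delivered network-vif-plugged event: under the per-instance "...-events" lock it looks for a registered waiter, finds none (this spawn path was not waiting on that event), and logs the "Received unexpected event" warning instead. A minimal, self-contained sketch of that register/dispatch pattern using plain threading primitives follows; the names are illustrative only, not Nova's InstanceEvents implementation.

    import threading

    class EventWaiters:
        """Toy version of the register/pop pattern seen in the log."""
        def __init__(self):
            self._lock = threading.Lock()
            self._waiters = {}  # (instance_uuid, event_name) -> threading.Event

        def prepare(self, instance_uuid, event_name):
            # Called by a code path that intends to wait (e.g. before plugging a VIF).
            ev = threading.Event()
            with self._lock:  # analogous to the per-instance "-events" lock
                self._waiters[(instance_uuid, event_name)] = ev
            return ev

        def dispatch(self, instance_uuid, event_name):
            # Called when an external event (network-vif-plugged-...) arrives.
            with self._lock:
                ev = self._waiters.pop((instance_uuid, event_name), None)
            if ev is None:
                print("WARNING: received unexpected event %s for instance %s"
                      % (event_name, instance_uuid))
            else:
                ev.set()  # wake up the waiter

    waiters = EventWaiters()
    # No waiter registered, so this mirrors the warning in the log:
    waiters.dispatch("88bc0884-174c-402c-9cbd-7099895f27a0",
                     "network-vif-plugged-904cb737-715a-45a2-8c4e-e99e90596e6e")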
Apr 24 00:20:22 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:22 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:22 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:23 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:23 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Resumed> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:20:23 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] VM Resumed (Lifecycle Event) Apr 24 00:20:23 user nova-compute[71205]: DEBUG nova.compute.manager [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Instance event wait completed in 0 seconds for {{(pid=71205) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 24 00:20:23 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Guest created on hypervisor {{(pid=71205) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 24 00:20:23 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Instance spawned successfully. 
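Once libvirt reports the guest as created ("Guest created on hypervisor", "Instance spawned successfully"), the manager re-reads the domain's power state before deciding what to record. The same check can be reproduced by hand with the libvirt Python bindings; a rough sketch, assuming the local libvirtd at qemu:///system that a DevStack nova-compute typically uses (UUID taken from the log, error handling omitted):

    import libvirt  # libvirt-python bindings

    UUID = "88bc0884-174c-402c-9cbd-7099895f27a0"

    conn = libvirt.open("qemu:///system")   # assumption: local system libvirtd
    dom = conn.lookupByUUIDString(UUID)     # the guest nova-compute just defined
    state, reason = dom.state()             # e.g. libvirt.VIR_DOMAIN_RUNNING, reason code
    print("domain %s active=%d state=%d reason=%d"
          % (dom.name(), dom.isActive(), state, reason))
    conn.close()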
Apr 24 00:20:23 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 24 00:20:23 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:20:23 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:20:23 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Found default for hw_cdrom_bus of ide {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:20:23 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Found default for hw_disk_bus of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:20:23 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Found default for hw_input_bus of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:20:23 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Found default for hw_pointer_model of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:20:23 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Found default for hw_video_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:20:23 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 
88bc0884-174c-402c-9cbd-7099895f27a0] Found default for hw_vif_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:20:23 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:20:23 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Started> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:20:23 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] VM Started (Lifecycle Event) Apr 24 00:20:23 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:20:23 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:20:24 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:20:24 user nova-compute[71205]: INFO nova.compute.manager [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Took 5.52 seconds to spawn the instance on the hypervisor. Apr 24 00:20:24 user nova-compute[71205]: DEBUG nova.compute.manager [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:20:24 user nova-compute[71205]: INFO nova.compute.manager [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Took 6.08 seconds to build instance. 
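The repeated "Synchronizing instance power state ... / During sync_power_state the instance has a pending task (spawning). Skip." pairs come from a simple guard: the Resumed/Started lifecycle events race with the build that is still in flight, so while task_state is set the recorded power_state (0, NOSTATE) is left alone even though the VM already reports 1 (RUNNING). A small illustrative sketch of that decision logic, with a hypothetical function rather than Nova's actual implementation:

    # power-state codes as they appear in the log
    NOSTATE, RUNNING = 0, 1

    def sync_power_state(db_power_state, vm_power_state, task_state):
        """Return the power state to record, or None to skip the sync."""
        if task_state is not None:
            # An operation (here: 'spawning') is still in progress; let it
            # set the final state and skip, exactly as the log shows.
            return None
        if db_power_state != vm_power_state:
            return vm_power_state  # correct the stale DB value
        return db_power_state

    # Mirrors the log: DB power_state 0, VM power_state 1, task_state 'spawning'
    assert sync_power_state(NOSTATE, RUNNING, "spawning") is None
    assert sync_power_state(NOSTATE, RUNNING, None) == RUNNING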
Apr 24 00:20:24 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-26f968e3-a8c4-4263-a69a-b0dc8297dc69 tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "88bc0884-174c-402c-9cbd-7099895f27a0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.171s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:20:24 user nova-compute[71205]: DEBUG nova.compute.manager [req-68187b63-c874-49c3-a643-5f5253ee886a req-d4fbd00f-847c-40b1-89b5-b81a14fb45ac service nova] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Received event network-vif-plugged-904cb737-715a-45a2-8c4e-e99e90596e6e {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:20:24 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-68187b63-c874-49c3-a643-5f5253ee886a req-d4fbd00f-847c-40b1-89b5-b81a14fb45ac service nova] Acquiring lock "88bc0884-174c-402c-9cbd-7099895f27a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:20:24 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-68187b63-c874-49c3-a643-5f5253ee886a req-d4fbd00f-847c-40b1-89b5-b81a14fb45ac service nova] Lock "88bc0884-174c-402c-9cbd-7099895f27a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:20:24 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-68187b63-c874-49c3-a643-5f5253ee886a req-d4fbd00f-847c-40b1-89b5-b81a14fb45ac service nova] Lock "88bc0884-174c-402c-9cbd-7099895f27a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:20:24 user nova-compute[71205]: DEBUG nova.compute.manager [req-68187b63-c874-49c3-a643-5f5253ee886a req-d4fbd00f-847c-40b1-89b5-b81a14fb45ac service nova] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] No waiting events found dispatching network-vif-plugged-904cb737-715a-45a2-8c4e-e99e90596e6e {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:20:24 user nova-compute[71205]: WARNING nova.compute.manager [req-68187b63-c874-49c3-a643-5f5253ee886a req-d4fbd00f-847c-40b1-89b5-b81a14fb45ac service nova] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Received unexpected event network-vif-plugged-904cb737-715a-45a2-8c4e-e99e90596e6e for instance with vm_state active and task_state None. 
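The build lock held for 6.171 s and the short-lived "...-events" locks above are all oslo.concurrency locks; the "Acquiring ... / acquired ... waited / released ... held" lines are emitted by lockutils itself. A minimal sketch of the same primitive, assuming oslo.concurrency is installed; the lock names are taken from the log, everything else is illustrative:

    from oslo_concurrency import lockutils

    INSTANCE_UUID = "88bc0884-174c-402c-9cbd-7099895f27a0"

    # Context-manager form: serialize all work on one instance, the way
    # _locked_do_build_and_run_instance wraps the whole build.
    with lockutils.lock(INSTANCE_UUID):
        pass  # build/run the instance here

    # Decorator form: serialize event handling on the "<uuid>-events" lock.
    @lockutils.synchronized(INSTANCE_UUID + "-events")
    def pop_instance_event(event_name):
        # look up / remove the waiter registered for event_name here
        return None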
Apr 24 00:20:25 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:28 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:30 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:31 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:35 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:40 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:43 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:45 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:50 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:50 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:52 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:55 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:20:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:21:00 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:21:03 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:21:05 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} 
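The long run of [POLLIN] wakeups on fd 21 is the ovsdbapp IDL connection that plugged the port at 00:20:20 staying in sync with ovsdb-server on tcp:127.0.0.1:6640 (the idle/inactivity-probe lines that follow are the same connection's keepalive). The AddBridgeCommand / AddPortCommand / DbSetCommand transaction logged earlier can be reproduced directly with ovsdbapp; a rough sketch, assuming ovsdbapp's Open_vSwitch schema helpers are importable as shown and ovsdb-server is reachable at that address (all values copied from the log):

    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    OVSDB = "tcp:127.0.0.1:6640"
    PORT = "tap904cb737-71"

    idl = connection.OvsdbIdl.from_server(OVSDB, "Open_vSwitch")
    api = impl_idl.OvsdbIdl(connection.Connection(idl, timeout=10))

    # One transaction: ensure br-int exists, add the tap port, and tag the
    # Interface row so OVN/Neutron can match it back to the Nova VIF.
    with api.transaction(check_error=True) as txn:
        txn.add(api.add_br("br-int", may_exist=True, datapath_type="system"))
        txn.add(api.add_port("br-int", PORT, may_exist=True))
        txn.add(api.db_set(
            "Interface", PORT,
            ("external_ids", {
                "iface-id": "904cb737-715a-45a2-8c4e-e99e90596e6e",
                "iface-status": "active",
                "attached-mac": "fa:16:3e:d1:cc:51",
                "vm-uuid": "88bc0884-174c-402c-9cbd-7099895f27a0",
            })))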
Apr 24 00:21:10 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:21:10 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:21:10 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:21:10 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:21:10 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:21:10 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:21:15 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:21:16 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:21:18 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:21:18 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71205) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 24 00:21:19 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:21:19 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:21:20 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:21:20 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:21:20 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:21:20 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:21:20 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:21:20 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:21:20 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:21:20 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/88bc0884-174c-402c-9cbd-7099895f27a0/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:21:20 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:21:20 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/88bc0884-174c-402c-9cbd-7099895f27a0/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:21:20 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/88bc0884-174c-402c-9cbd-7099895f27a0/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:21:20 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/88bc0884-174c-402c-9cbd-7099895f27a0/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:21:20 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:21:20 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:21:20 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:21:21 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:21:21 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to 
have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:21:21 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:21:21 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Hypervisor/Node resource view: name=user free_ram=8912MB free_disk=26.62872314453125GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:21:21 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:21:21 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:21:21 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance cf2c88d5-8347-4166-a037-158f29c32d1a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:21:21 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance 88bc0884-174c-402c-9cbd-7099895f27a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:21:21 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:21:21 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:21:21 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:21:21 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:21:21 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:21:21 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.260s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:21:22 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:21:22 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:21:22 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Starting heal instance info cache {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 24 00:21:22 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Rebuilding the list of instances to heal {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 24 00:21:22 user nova-compute[71205]: DEBUG 
oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "refresh_cache-cf2c88d5-8347-4166-a037-158f29c32d1a" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:21:22 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquired lock "refresh_cache-cf2c88d5-8347-4166-a037-158f29c32d1a" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:21:22 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Forcefully refreshing network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 24 00:21:22 user nova-compute[71205]: DEBUG nova.objects.instance [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lazy-loading 'info_cache' on Instance uuid cf2c88d5-8347-4166-a037-158f29c32d1a {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:21:23 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Updating instance_info_cache with network_info: [{"id": "783a3713-64fb-48f3-b3ba-0312249006eb", "address": "fa:16:3e:87:78:10", "network": {"id": "551bb188-8d6d-484a-813f-5f76f0ec0e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1308396726-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.38", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "211ed190ee5c4a0b98b85960339ea437", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap783a3713-64", "ovs_interfaceid": "783a3713-64fb-48f3-b3ba-0312249006eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:21:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Releasing lock "refresh_cache-cf2c88d5-8347-4166-a037-158f29c32d1a" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:21:23 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Updated the network info_cache for instance {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 24 00:21:23 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:21:25 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:21:30 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:21:33 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:21:35 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:21:40 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:21:45 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:21:50 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:21:55 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:22:00 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:22:05 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:22:09 user nova-compute[71205]: DEBUG nova.compute.manager [req-a315d698-58d7-4576-b7f7-f868b6f4f160 req-1d92c2a9-f9a6-438a-a543-3a2278a7bb1d service nova] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Received event network-changed-904cb737-715a-45a2-8c4e-e99e90596e6e {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:22:09 user nova-compute[71205]: DEBUG nova.compute.manager [req-a315d698-58d7-4576-b7f7-f868b6f4f160 req-1d92c2a9-f9a6-438a-a543-3a2278a7bb1d service nova] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Refreshing instance network info cache due to event network-changed-904cb737-715a-45a2-8c4e-e99e90596e6e. 
{{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:22:09 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-a315d698-58d7-4576-b7f7-f868b6f4f160 req-1d92c2a9-f9a6-438a-a543-3a2278a7bb1d service nova] Acquiring lock "refresh_cache-88bc0884-174c-402c-9cbd-7099895f27a0" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:22:09 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-a315d698-58d7-4576-b7f7-f868b6f4f160 req-1d92c2a9-f9a6-438a-a543-3a2278a7bb1d service nova] Acquired lock "refresh_cache-88bc0884-174c-402c-9cbd-7099895f27a0" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:22:09 user nova-compute[71205]: DEBUG nova.network.neutron [req-a315d698-58d7-4576-b7f7-f868b6f4f160 req-1d92c2a9-f9a6-438a-a543-3a2278a7bb1d service nova] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Refreshing network info cache for port 904cb737-715a-45a2-8c4e-e99e90596e6e {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:22:09 user nova-compute[71205]: DEBUG nova.network.neutron [req-a315d698-58d7-4576-b7f7-f868b6f4f160 req-1d92c2a9-f9a6-438a-a543-3a2278a7bb1d service nova] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Updated VIF entry in instance network info cache for port 904cb737-715a-45a2-8c4e-e99e90596e6e. {{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:22:09 user nova-compute[71205]: DEBUG nova.network.neutron [req-a315d698-58d7-4576-b7f7-f868b6f4f160 req-1d92c2a9-f9a6-438a-a543-3a2278a7bb1d service nova] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Updating instance_info_cache with network_info: [{"id": "904cb737-715a-45a2-8c4e-e99e90596e6e", "address": "fa:16:3e:d1:cc:51", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap904cb737-71", "ovs_interfaceid": "904cb737-715a-45a2-8c4e-e99e90596e6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:22:09 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-a315d698-58d7-4576-b7f7-f868b6f4f160 req-1d92c2a9-f9a6-438a-a543-3a2278a7bb1d service nova] Releasing lock "refresh_cache-88bc0884-174c-402c-9cbd-7099895f27a0" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:22:10 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71205) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:22:10 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Cleaning up deleted instances {{(pid=71205) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 24 00:22:10 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] There are 0 instances to clean {{(pid=71205) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 24 00:22:10 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:22:10 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-33a1d966-bdd3-454d-bdf6-e00a453d939a tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquiring lock "88bc0884-174c-402c-9cbd-7099895f27a0" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:22:10 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-33a1d966-bdd3-454d-bdf6-e00a453d939a tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "88bc0884-174c-402c-9cbd-7099895f27a0" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:22:10 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-33a1d966-bdd3-454d-bdf6-e00a453d939a tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquiring lock "88bc0884-174c-402c-9cbd-7099895f27a0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:22:10 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-33a1d966-bdd3-454d-bdf6-e00a453d939a tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "88bc0884-174c-402c-9cbd-7099895f27a0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:22:10 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-33a1d966-bdd3-454d-bdf6-e00a453d939a tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "88bc0884-174c-402c-9cbd-7099895f27a0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:22:10 user nova-compute[71205]: INFO nova.compute.manager [None req-33a1d966-bdd3-454d-bdf6-e00a453d939a tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Terminating instance Apr 24 00:22:10 user nova-compute[71205]: DEBUG nova.compute.manager [None req-33a1d966-bdd3-454d-bdf6-e00a453d939a tempest-AttachVolumeNegativeTest-272859998 
tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Start destroying the instance on the hypervisor. {{(pid=71205) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 24 00:22:10 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:22:10 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:22:10 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:22:10 user nova-compute[71205]: DEBUG nova.compute.manager [req-299acf93-6cd9-44e9-af0d-7b8daab21fa3 req-2eefd53d-783f-4b5c-a959-862372cc63ad service nova] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Received event network-vif-unplugged-904cb737-715a-45a2-8c4e-e99e90596e6e {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:22:10 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-299acf93-6cd9-44e9-af0d-7b8daab21fa3 req-2eefd53d-783f-4b5c-a959-862372cc63ad service nova] Acquiring lock "88bc0884-174c-402c-9cbd-7099895f27a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:22:10 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-299acf93-6cd9-44e9-af0d-7b8daab21fa3 req-2eefd53d-783f-4b5c-a959-862372cc63ad service nova] Lock "88bc0884-174c-402c-9cbd-7099895f27a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:22:10 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-299acf93-6cd9-44e9-af0d-7b8daab21fa3 req-2eefd53d-783f-4b5c-a959-862372cc63ad service nova] Lock "88bc0884-174c-402c-9cbd-7099895f27a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:22:10 user nova-compute[71205]: DEBUG nova.compute.manager [req-299acf93-6cd9-44e9-af0d-7b8daab21fa3 req-2eefd53d-783f-4b5c-a959-862372cc63ad service nova] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] No waiting events found dispatching network-vif-unplugged-904cb737-715a-45a2-8c4e-e99e90596e6e {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:22:10 user nova-compute[71205]: DEBUG nova.compute.manager [req-299acf93-6cd9-44e9-af0d-7b8daab21fa3 req-2eefd53d-783f-4b5c-a959-862372cc63ad service nova] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Received event network-vif-unplugged-904cb737-715a-45a2-8c4e-e99e90596e6e for instance with task_state deleting. 
{{(pid=71205) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 24 00:22:11 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:22:11 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Instance destroyed successfully. Apr 24 00:22:11 user nova-compute[71205]: DEBUG nova.objects.instance [None req-33a1d966-bdd3-454d-bdf6-e00a453d939a tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lazy-loading 'resources' on Instance uuid 88bc0884-174c-402c-9cbd-7099895f27a0 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:22:11 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-33a1d966-bdd3-454d-bdf6-e00a453d939a tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:20:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-806357688',display_name='tempest-AttachVolumeNegativeTest-server-806357688',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-806357688',id=22,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIS7J2r3hvdb0rW9oq624HJm6iEzqAHNFXYccC2ziX4/UQM6xe9IVnyYzcYeVB3AHvem3MrTEL+AAJWg7X9ws6qIEWDFHnxd+tlDDo0bvhx6gq7aYfsNiTDpRmFhx0Gb2g==',key_name='tempest-keypair-1028661157',keypairs=,launch_index=0,launched_at=2023-04-24T00:20:24Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='97d1e8a757a746329ea363af81a3c6b4',ramdisk_id='',reservation_id='r-khqn9ba6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-272859998',owner_user_name='tempest-AttachVolumeNegativeTest-272859998-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-24T00:20:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35edcadbe77c4f4fa8304216e7f61d4a',uuid=88bc0884-174c-402c-9cbd-7099895f27a0,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "904cb737-715a-45a2-8c4e-e99e90596e6e", "address": "fa:16:3e:d1:cc:51", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap904cb737-71", "ovs_interfaceid": "904cb737-715a-45a2-8c4e-e99e90596e6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 24 00:22:11 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-33a1d966-bdd3-454d-bdf6-e00a453d939a tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Converting VIF {"id": "904cb737-715a-45a2-8c4e-e99e90596e6e", "address": "fa:16:3e:d1:cc:51", "network": {"id": "1a83eb44-caaf-46ff-8823-be4580052b3f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-883629256-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "97d1e8a757a746329ea363af81a3c6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap904cb737-71", "ovs_interfaceid": "904cb737-715a-45a2-8c4e-e99e90596e6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:22:11 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-33a1d966-bdd3-454d-bdf6-e00a453d939a tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d1:cc:51,bridge_name='br-int',has_traffic_filtering=True,id=904cb737-715a-45a2-8c4e-e99e90596e6e,network=Network(1a83eb44-caaf-46ff-8823-be4580052b3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap904cb737-71') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:22:11 user nova-compute[71205]: DEBUG os_vif [None req-33a1d966-bdd3-454d-bdf6-e00a453d939a tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d1:cc:51,bridge_name='br-int',has_traffic_filtering=True,id=904cb737-715a-45a2-8c4e-e99e90596e6e,network=Network(1a83eb44-caaf-46ff-8823-be4580052b3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap904cb737-71') {{(pid=71205) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 24 00:22:11 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:22:11 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap904cb737-71, bridge=br-int, if_exists=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:22:11 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:22:11 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:22:11 user nova-compute[71205]: INFO os_vif [None req-33a1d966-bdd3-454d-bdf6-e00a453d939a tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d1:cc:51,bridge_name='br-int',has_traffic_filtering=True,id=904cb737-715a-45a2-8c4e-e99e90596e6e,network=Network(1a83eb44-caaf-46ff-8823-be4580052b3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap904cb737-71') Apr 24 00:22:11 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-33a1d966-bdd3-454d-bdf6-e00a453d939a tempest-AttachVolumeNegativeTest-272859998 
tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Deleting instance files /opt/stack/data/nova/instances/88bc0884-174c-402c-9cbd-7099895f27a0_del Apr 24 00:22:11 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-33a1d966-bdd3-454d-bdf6-e00a453d939a tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Deletion of /opt/stack/data/nova/instances/88bc0884-174c-402c-9cbd-7099895f27a0_del complete Apr 24 00:22:11 user nova-compute[71205]: INFO nova.compute.manager [None req-33a1d966-bdd3-454d-bdf6-e00a453d939a tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Took 0.66 seconds to destroy the instance on the hypervisor. Apr 24 00:22:11 user nova-compute[71205]: DEBUG oslo.service.loopingcall [None req-33a1d966-bdd3-454d-bdf6-e00a453d939a tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71205) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 24 00:22:11 user nova-compute[71205]: DEBUG nova.compute.manager [-] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Deallocating network for instance {{(pid=71205) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 24 00:22:11 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] deallocate_for_instance() {{(pid=71205) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 24 00:22:12 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:22:12 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Took 1.36 seconds to deallocate network for instance. 
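
The teardown above is reported step by step, with per-step timings in the INFO lines ("Took 0.66 seconds to destroy the instance on the hypervisor.", "Took 1.36 seconds to deallocate network for instance."). A minimal sketch, assuming a plain journal dump like this one as input (the script name and file path are hypothetical, not part of Nova), for collecting those timings per instance UUID:

    import re
    import sys
    from collections import defaultdict

    # Matches Nova's "Took <seconds> seconds to <action>" INFO lines, e.g.
    # "[instance: 88bc0884-...] Took 1.36 seconds to deallocate network for instance."
    PATTERN = re.compile(
        r"\[instance: (?P<uuid>[0-9a-f-]{36})\] "
        r"Took (?P<seconds>[0-9.]+) seconds to (?P<action>[^.]+)\."
    )

    def summarize(lines):
        timings = defaultdict(list)
        for line in lines:
            m = PATTERN.search(line)
            if m:
                timings[m.group("uuid")].append(
                    (m.group("action").strip(), float(m.group("seconds"))))
        return timings

    if __name__ == "__main__":
        # Usage (hypothetical file name): python3 teardown_timings.py nova-compute.log
        with open(sys.argv[1]) as f:
            for uuid, steps in summarize(f).items():
                print(uuid, f"{sum(s for _, s in steps):.2f}s total")
                for action, seconds in steps:
                    print(f"  {seconds:6.2f}s  {action}")
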
Apr 24 00:22:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-33a1d966-bdd3-454d-bdf6-e00a453d939a tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:22:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-33a1d966-bdd3-454d-bdf6-e00a453d939a tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:22:12 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-33a1d966-bdd3-454d-bdf6-e00a453d939a tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:22:12 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-33a1d966-bdd3-454d-bdf6-e00a453d939a tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:22:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-33a1d966-bdd3-454d-bdf6-e00a453d939a tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.130s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:22:12 user nova-compute[71205]: INFO nova.scheduler.client.report [None req-33a1d966-bdd3-454d-bdf6-e00a453d939a tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Deleted allocations for instance 88bc0884-174c-402c-9cbd-7099895f27a0 Apr 24 00:22:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-33a1d966-bdd3-454d-bdf6-e00a453d939a tempest-AttachVolumeNegativeTest-272859998 tempest-AttachVolumeNegativeTest-272859998-project-member] Lock "88bc0884-174c-402c-9cbd-7099895f27a0" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.338s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:22:12 user nova-compute[71205]: DEBUG nova.compute.manager [req-e686d26b-2093-4ccd-b12e-0926dcc6312b req-d52e258c-ee17-42b7-a297-da0ce85dfdf8 service nova] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Received event network-vif-plugged-904cb737-715a-45a2-8c4e-e99e90596e6e {{(pid=71205) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:22:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-e686d26b-2093-4ccd-b12e-0926dcc6312b req-d52e258c-ee17-42b7-a297-da0ce85dfdf8 service nova] Acquiring lock "88bc0884-174c-402c-9cbd-7099895f27a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:22:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-e686d26b-2093-4ccd-b12e-0926dcc6312b req-d52e258c-ee17-42b7-a297-da0ce85dfdf8 service nova] Lock "88bc0884-174c-402c-9cbd-7099895f27a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:22:12 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-e686d26b-2093-4ccd-b12e-0926dcc6312b req-d52e258c-ee17-42b7-a297-da0ce85dfdf8 service nova] Lock "88bc0884-174c-402c-9cbd-7099895f27a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:22:12 user nova-compute[71205]: DEBUG nova.compute.manager [req-e686d26b-2093-4ccd-b12e-0926dcc6312b req-d52e258c-ee17-42b7-a297-da0ce85dfdf8 service nova] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] No waiting events found dispatching network-vif-plugged-904cb737-715a-45a2-8c4e-e99e90596e6e {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:22:12 user nova-compute[71205]: WARNING nova.compute.manager [req-e686d26b-2093-4ccd-b12e-0926dcc6312b req-d52e258c-ee17-42b7-a297-da0ce85dfdf8 service nova] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Received unexpected event network-vif-plugged-904cb737-715a-45a2-8c4e-e99e90596e6e for instance with vm_state deleted and task_state None. Apr 24 00:22:12 user nova-compute[71205]: DEBUG nova.compute.manager [req-e686d26b-2093-4ccd-b12e-0926dcc6312b req-d52e258c-ee17-42b7-a297-da0ce85dfdf8 service nova] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Received event network-vif-deleted-904cb737-715a-45a2-8c4e-e99e90596e6e {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:22:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:22:16 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:22:16 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:22:19 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:22:19 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71205) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 24 00:22:20 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:22:20 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:22:20 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Cleaning up deleted instances with incomplete migration {{(pid=71205) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 24 00:22:21 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:22:21 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:22:21 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:22:21 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:22:21 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:22:21 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:22:21 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:22:21 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Starting heal instance info cache {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 24 00:22:21 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Didn't find any instances for network info cache update. 
{{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 24 00:22:21 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:22:21 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:22:21 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:22:22 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:22:22 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:22:22 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:22:22 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:22:22 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:22:22 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:22:22 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:22:22 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None 
req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:22:22 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:22:22 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:22:23 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:22:23 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
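
The resource audit above inspects the instance's root disk by shelling out to qemu-img info under oslo_concurrency.prlimit (address-space and CPU caps) with --force-share --output=json. A minimal sketch that reruns that same command and parses the JSON it prints; this is an illustration only, not Nova's own code path, and the disk path is simply the one shown in the log:

    import json
    import subprocess

    def qemu_img_info(path):
        # Same invocation as logged above: prlimit caps the child to 1 GiB of
        # address space and 30 s of CPU time, then qemu-img reads the image
        # header in shared mode and emits JSON.
        cmd = [
            "/usr/bin/python3.10", "-m", "oslo_concurrency.prlimit",
            "--as=1073741824", "--cpu=30", "--",
            "env", "LC_ALL=C", "LANG=C",
            "qemu-img", "info", path, "--force-share", "--output=json",
        ]
        out = subprocess.run(cmd, capture_output=True, check=True, text=True)
        return json.loads(out.stdout)

    if __name__ == "__main__":
        info = qemu_img_info(
            "/opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk")
        # qemu-img's JSON output includes "format" and "virtual-size" (in bytes).
        print(info.get("format"), info.get("virtual-size"))
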
Apr 24 00:22:23 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Hypervisor/Node resource view: name=user free_ram=9105MB free_disk=26.647315979003906GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:22:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:22:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:22:23 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance cf2c88d5-8347-4166-a037-158f29c32d1a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:22:23 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:22:23 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:22:23 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:22:23 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:22:23 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:22:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.194s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:22:24 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:22:26 user nova-compute[71205]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:22:26 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] VM Stopped (Lifecycle Event) Apr 24 00:22:26 user nova-compute[71205]: DEBUG nova.compute.manager [None req-e087e40e-c750-4f09-ac1f-3ac019747875 None None] [instance: 88bc0884-174c-402c-9cbd-7099895f27a0] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:22:26 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:22:31 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms 
timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:22:36 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:22:41 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:22:41 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:22:41 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:22:41 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:22:41 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:22:41 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:22:46 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:22:51 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:22:51 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:22:51 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:22:51 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:22:51 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:22:51 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:22:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:01 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:01 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:06 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:11 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:16 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:18 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:23:21 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:21 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:23:21 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:23:21 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71205) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 24 00:23:22 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:23:22 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:23:22 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:23:22 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:23:23 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:23:23 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Starting heal instance info cache {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 24 00:23:23 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Rebuilding the list of instances to heal {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 24 00:23:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "refresh_cache-cf2c88d5-8347-4166-a037-158f29c32d1a" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:23:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquired lock "refresh_cache-cf2c88d5-8347-4166-a037-158f29c32d1a" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:23:23 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Forcefully refreshing network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 24 00:23:23 user nova-compute[71205]: DEBUG nova.objects.instance [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lazy-loading 'info_cache' on Instance uuid cf2c88d5-8347-4166-a037-158f29c32d1a {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:23:23 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Updating instance_info_cache with network_info: [{"id": 
"783a3713-64fb-48f3-b3ba-0312249006eb", "address": "fa:16:3e:87:78:10", "network": {"id": "551bb188-8d6d-484a-813f-5f76f0ec0e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1308396726-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.38", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "211ed190ee5c4a0b98b85960339ea437", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap783a3713-64", "ovs_interfaceid": "783a3713-64fb-48f3-b3ba-0312249006eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:23:23 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Releasing lock "refresh_cache-cf2c88d5-8347-4166-a037-158f29c32d1a" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:23:23 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Updated the network info_cache for instance {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 24 00:23:24 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:23:24 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:23:24 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:23:24 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:23:24 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:23:24 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 
None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:23:24 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:23:24 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:23:24 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:23:24 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:23:25 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:23:25 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 24 00:23:25 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Hypervisor/Node resource view: name=user free_ram=9074MB free_disk=26.63697052001953GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:23:25 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:23:25 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:23:25 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance cf2c88d5-8347-4166-a037-158f29c32d1a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:23:25 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:23:25 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:23:25 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Refreshing inventories for resource provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 24 00:23:25 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Updating ProviderTree inventory for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 24 00:23:25 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Updating inventory in ProviderTree for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 24 00:23:25 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Refreshing aggregate associations for resource provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4, aggregates: None {{(pid=71205) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 24 00:23:25 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Refreshing trait associations for resource provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4, traits: 
HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_DEVICE_TAGGING {{(pid=71205) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 24 00:23:25 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:23:25 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:23:25 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:23:25 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.404s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:23:26 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:23:31 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:36 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:41 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on 
fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:45 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-4d1caed8-0fa9-4aa4-95ae-3083ef0d4348 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Acquiring lock "cf2c88d5-8347-4166-a037-158f29c32d1a" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:23:45 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-4d1caed8-0fa9-4aa4-95ae-3083ef0d4348 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Lock "cf2c88d5-8347-4166-a037-158f29c32d1a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:23:45 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-4d1caed8-0fa9-4aa4-95ae-3083ef0d4348 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Acquiring lock "cf2c88d5-8347-4166-a037-158f29c32d1a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:23:45 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-4d1caed8-0fa9-4aa4-95ae-3083ef0d4348 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Lock "cf2c88d5-8347-4166-a037-158f29c32d1a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:23:45 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-4d1caed8-0fa9-4aa4-95ae-3083ef0d4348 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Lock "cf2c88d5-8347-4166-a037-158f29c32d1a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:23:45 user nova-compute[71205]: INFO nova.compute.manager [None req-4d1caed8-0fa9-4aa4-95ae-3083ef0d4348 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Terminating instance Apr 24 00:23:45 user nova-compute[71205]: DEBUG nova.compute.manager [None req-4d1caed8-0fa9-4aa4-95ae-3083ef0d4348 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Start destroying the instance on the hypervisor. 
{{(pid=71205) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 24 00:23:46 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:46 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:46 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:46 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:46 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:46 user nova-compute[71205]: DEBUG nova.compute.manager [req-8de4c66c-ff8e-4fd8-aa3d-3d17192a2465 req-86c30afc-18d5-43b0-afdb-c86924fa7df3 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Received event network-vif-unplugged-783a3713-64fb-48f3-b3ba-0312249006eb {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:23:46 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-8de4c66c-ff8e-4fd8-aa3d-3d17192a2465 req-86c30afc-18d5-43b0-afdb-c86924fa7df3 service nova] Acquiring lock "cf2c88d5-8347-4166-a037-158f29c32d1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:23:46 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-8de4c66c-ff8e-4fd8-aa3d-3d17192a2465 req-86c30afc-18d5-43b0-afdb-c86924fa7df3 service nova] Lock "cf2c88d5-8347-4166-a037-158f29c32d1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:23:46 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-8de4c66c-ff8e-4fd8-aa3d-3d17192a2465 req-86c30afc-18d5-43b0-afdb-c86924fa7df3 service nova] Lock "cf2c88d5-8347-4166-a037-158f29c32d1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:23:46 user nova-compute[71205]: DEBUG nova.compute.manager [req-8de4c66c-ff8e-4fd8-aa3d-3d17192a2465 req-86c30afc-18d5-43b0-afdb-c86924fa7df3 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] No waiting events found dispatching network-vif-unplugged-783a3713-64fb-48f3-b3ba-0312249006eb {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:23:46 user nova-compute[71205]: DEBUG nova.compute.manager [req-8de4c66c-ff8e-4fd8-aa3d-3d17192a2465 req-86c30afc-18d5-43b0-afdb-c86924fa7df3 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Received event network-vif-unplugged-783a3713-64fb-48f3-b3ba-0312249006eb for instance with task_state deleting. 
{{(pid=71205) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 24 00:23:46 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:46 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:46 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:46 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:46 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:46 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Instance destroyed successfully. Apr 24 00:23:46 user nova-compute[71205]: DEBUG nova.objects.instance [None req-4d1caed8-0fa9-4aa4-95ae-3083ef0d4348 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Lazy-loading 'resources' on Instance uuid cf2c88d5-8347-4166-a037-158f29c32d1a {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:23:46 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-4d1caed8-0fa9-4aa4-95ae-3083ef0d4348 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:14:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-75602253',display_name='tempest-ServerActionsTestJSON-server-75602253',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serveractionstestjson-server-75602253',id=17,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHBH505bABPeY47tqqwKUNL0j9rZei8h+xMYLIM0SjTR/WY833MZWOzZDU3xf+eKarpT1DDNfoU7YX1pgGnkISVtzLfDBcBBJeM+PXJDnAxAnfo3AHLlU/Wx1vdEG2Tqyg==',key_name='tempest-keypair-1442678964',keypairs=,launch_index=0,launched_at=2023-04-24T00:15:02Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='211ed190ee5c4a0b98b85960339ea437',ramdisk_id='',reservation_id='r-q1azt7t2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerActionsTestJSON-1785200975',owner_user_name='tempest-ServerActionsTestJSON-1785200975-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-24T00:15:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='8536b0b100af4c81b0114b37a10ec017',uuid=cf2c88d5-8347-4166-a037-158f29c32d1a,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "783a3713-64fb-48f3-b3ba-0312249006eb", "address": "fa:16:3e:87:78:10", "network": {"id": "551bb188-8d6d-484a-813f-5f76f0ec0e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1308396726-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.38", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "211ed190ee5c4a0b98b85960339ea437", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap783a3713-64", "ovs_interfaceid": "783a3713-64fb-48f3-b3ba-0312249006eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 24 00:23:46 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-4d1caed8-0fa9-4aa4-95ae-3083ef0d4348 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Converting VIF {"id": "783a3713-64fb-48f3-b3ba-0312249006eb", "address": "fa:16:3e:87:78:10", "network": {"id": "551bb188-8d6d-484a-813f-5f76f0ec0e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1308396726-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.38", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "211ed190ee5c4a0b98b85960339ea437", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap783a3713-64", "ovs_interfaceid": "783a3713-64fb-48f3-b3ba-0312249006eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:23:46 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-4d1caed8-0fa9-4aa4-95ae-3083ef0d4348 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:87:78:10,bridge_name='br-int',has_traffic_filtering=True,id=783a3713-64fb-48f3-b3ba-0312249006eb,network=Network(551bb188-8d6d-484a-813f-5f76f0ec0e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap783a3713-64') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:23:46 user nova-compute[71205]: DEBUG os_vif [None req-4d1caed8-0fa9-4aa4-95ae-3083ef0d4348 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:87:78:10,bridge_name='br-int',has_traffic_filtering=True,id=783a3713-64fb-48f3-b3ba-0312249006eb,network=Network(551bb188-8d6d-484a-813f-5f76f0ec0e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap783a3713-64') {{(pid=71205) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 24 00:23:46 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:46 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap783a3713-64, bridge=br-int, if_exists=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:23:46 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:46 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:23:46 user nova-compute[71205]: INFO os_vif [None req-4d1caed8-0fa9-4aa4-95ae-3083ef0d4348 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:87:78:10,bridge_name='br-int',has_traffic_filtering=True,id=783a3713-64fb-48f3-b3ba-0312249006eb,network=Network(551bb188-8d6d-484a-813f-5f76f0ec0e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap783a3713-64') Apr 24 00:23:46 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-4d1caed8-0fa9-4aa4-95ae-3083ef0d4348 tempest-ServerActionsTestJSON-1785200975 
tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Deleting instance files /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a_del Apr 24 00:23:46 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-4d1caed8-0fa9-4aa4-95ae-3083ef0d4348 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Deletion of /opt/stack/data/nova/instances/cf2c88d5-8347-4166-a037-158f29c32d1a_del complete Apr 24 00:23:46 user nova-compute[71205]: INFO nova.compute.manager [None req-4d1caed8-0fa9-4aa4-95ae-3083ef0d4348 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Took 0.88 seconds to destroy the instance on the hypervisor. Apr 24 00:23:46 user nova-compute[71205]: DEBUG oslo.service.loopingcall [None req-4d1caed8-0fa9-4aa4-95ae-3083ef0d4348 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71205) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 24 00:23:46 user nova-compute[71205]: DEBUG nova.compute.manager [-] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Deallocating network for instance {{(pid=71205) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 24 00:23:46 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] deallocate_for_instance() {{(pid=71205) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 24 00:23:47 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:47 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:47 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:47 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:47 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:47 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:23:47 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Took 0.96 seconds to deallocate network for instance. 
Apr 24 00:23:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-4d1caed8-0fa9-4aa4-95ae-3083ef0d4348 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:23:47 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-4d1caed8-0fa9-4aa4-95ae-3083ef0d4348 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:23:47 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-4d1caed8-0fa9-4aa4-95ae-3083ef0d4348 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:23:47 user nova-compute[71205]: DEBUG nova.compute.manager [req-a2e53f91-3c77-45eb-a463-81b56a8d5f44 req-6dad18e6-8ecd-44c8-8636-18d8391d30d3 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Received event network-vif-deleted-783a3713-64fb-48f3-b3ba-0312249006eb {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:23:47 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-4d1caed8-0fa9-4aa4-95ae-3083ef0d4348 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:23:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-4d1caed8-0fa9-4aa4-95ae-3083ef0d4348 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.109s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:23:48 user nova-compute[71205]: INFO nova.scheduler.client.report [None req-4d1caed8-0fa9-4aa4-95ae-3083ef0d4348 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Deleted allocations for instance cf2c88d5-8347-4166-a037-158f29c32d1a Apr 24 00:23:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-4d1caed8-0fa9-4aa4-95ae-3083ef0d4348 tempest-ServerActionsTestJSON-1785200975 tempest-ServerActionsTestJSON-1785200975-project-member] Lock "cf2c88d5-8347-4166-a037-158f29c32d1a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.144s {{(pid=71205) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:23:48 user nova-compute[71205]: DEBUG nova.compute.manager [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Received event network-vif-plugged-783a3713-64fb-48f3-b3ba-0312249006eb {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:23:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] Acquiring lock "cf2c88d5-8347-4166-a037-158f29c32d1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:23:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] Lock "cf2c88d5-8347-4166-a037-158f29c32d1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:23:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] Lock "cf2c88d5-8347-4166-a037-158f29c32d1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:23:48 user nova-compute[71205]: DEBUG nova.compute.manager [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] No waiting events found dispatching network-vif-plugged-783a3713-64fb-48f3-b3ba-0312249006eb {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:23:48 user nova-compute[71205]: WARNING nova.compute.manager [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Received unexpected event network-vif-plugged-783a3713-64fb-48f3-b3ba-0312249006eb for instance with vm_state deleted and task_state None. 
Apr 24 00:23:48 user nova-compute[71205]: DEBUG nova.compute.manager [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Received event network-vif-plugged-783a3713-64fb-48f3-b3ba-0312249006eb {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:23:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] Acquiring lock "cf2c88d5-8347-4166-a037-158f29c32d1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:23:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] Lock "cf2c88d5-8347-4166-a037-158f29c32d1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:23:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] Lock "cf2c88d5-8347-4166-a037-158f29c32d1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:23:48 user nova-compute[71205]: DEBUG nova.compute.manager [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] No waiting events found dispatching network-vif-plugged-783a3713-64fb-48f3-b3ba-0312249006eb {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:23:48 user nova-compute[71205]: WARNING nova.compute.manager [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Received unexpected event network-vif-plugged-783a3713-64fb-48f3-b3ba-0312249006eb for instance with vm_state deleted and task_state None. 
Apr 24 00:23:48 user nova-compute[71205]: DEBUG nova.compute.manager [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Received event network-vif-plugged-783a3713-64fb-48f3-b3ba-0312249006eb {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:23:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] Acquiring lock "cf2c88d5-8347-4166-a037-158f29c32d1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:23:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] Lock "cf2c88d5-8347-4166-a037-158f29c32d1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:23:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] Lock "cf2c88d5-8347-4166-a037-158f29c32d1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:23:48 user nova-compute[71205]: DEBUG nova.compute.manager [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] No waiting events found dispatching network-vif-plugged-783a3713-64fb-48f3-b3ba-0312249006eb {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:23:48 user nova-compute[71205]: WARNING nova.compute.manager [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Received unexpected event network-vif-plugged-783a3713-64fb-48f3-b3ba-0312249006eb for instance with vm_state deleted and task_state None. 
Apr 24 00:23:48 user nova-compute[71205]: DEBUG nova.compute.manager [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Received event network-vif-unplugged-783a3713-64fb-48f3-b3ba-0312249006eb {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:23:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] Acquiring lock "cf2c88d5-8347-4166-a037-158f29c32d1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:23:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] Lock "cf2c88d5-8347-4166-a037-158f29c32d1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:23:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] Lock "cf2c88d5-8347-4166-a037-158f29c32d1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:23:48 user nova-compute[71205]: DEBUG nova.compute.manager [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] No waiting events found dispatching network-vif-unplugged-783a3713-64fb-48f3-b3ba-0312249006eb {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:23:48 user nova-compute[71205]: WARNING nova.compute.manager [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Received unexpected event network-vif-unplugged-783a3713-64fb-48f3-b3ba-0312249006eb for instance with vm_state deleted and task_state None. 
Apr 24 00:23:48 user nova-compute[71205]: DEBUG nova.compute.manager [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Received event network-vif-plugged-783a3713-64fb-48f3-b3ba-0312249006eb {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:23:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] Acquiring lock "cf2c88d5-8347-4166-a037-158f29c32d1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:23:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] Lock "cf2c88d5-8347-4166-a037-158f29c32d1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:23:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] Lock "cf2c88d5-8347-4166-a037-158f29c32d1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:23:48 user nova-compute[71205]: DEBUG nova.compute.manager [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] No waiting events found dispatching network-vif-plugged-783a3713-64fb-48f3-b3ba-0312249006eb {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:23:48 user nova-compute[71205]: WARNING nova.compute.manager [req-bf2148f3-8019-4cb9-808d-bb908047a6be req-a2f0aa3b-7d77-4fc3-9c2b-9c4b48535e08 service nova] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Received unexpected event network-vif-plugged-783a3713-64fb-48f3-b3ba-0312249006eb for instance with vm_state deleted and task_state None. 
Apr 24 00:23:51 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:23:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:23:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:23:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:23:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:23:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:23:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:24:01 user nova-compute[71205]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:24:01 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] VM Stopped (Lifecycle Event) Apr 24 00:24:01 user nova-compute[71205]: DEBUG nova.compute.manager [None req-5983791f-1b23-4a2a-83d7-4ef5c1fe034b None None] [instance: cf2c88d5-8347-4166-a037-158f29c32d1a] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:24:01 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:24:06 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:24:11 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:24:16 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:24:16 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:24:16 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:24:16 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:24:16 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:24:16 user 
nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:24:19 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:24:21 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:24:22 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:24:22 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:24:22 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71205) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 24 00:24:23 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:24:23 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:24:24 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:24:24 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:24:24 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Starting heal instance info cache {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 24 00:24:24 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Rebuilding the list of instances to heal {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 24 00:24:24 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Didn't find any instances for network info cache update. 
{{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 24 00:24:24 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:24:24 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:24:24 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:24:24 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:24:24 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:24:24 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:24:24 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:24:24 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 24 00:24:24 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Hypervisor/Node resource view: name=user free_ram=9261MB free_disk=26.655601501464844GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:24:24 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:24:24 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:24:24 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:24:24 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:24:24 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:24:24 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:24:24 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:24:24 user nova-compute[71205]: DEBUG 
oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.141s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:24:27 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:24:27 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:24:27 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:24:27 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:24:27 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:24:27 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:24:32 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:24:36 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:24:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:24:39 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:24:42 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:24:45 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:24:47 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:24:48 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:24:48 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:24:51 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Acquiring lock "5d60a364-b914-4d2c-8fcc-5944148dcecf" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:24:51 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Lock "5d60a364-b914-4d2c-8fcc-5944148dcecf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:24:51 user nova-compute[71205]: DEBUG nova.compute.manager [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Starting instance... {{(pid=71205) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 24 00:24:51 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:24:51 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:24:51 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71205) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 24 00:24:51 user nova-compute[71205]: INFO nova.compute.claims [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Claim successful on node user Apr 24 00:24:51 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:24:51 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:24:51 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.178s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:24:51 user nova-compute[71205]: DEBUG nova.compute.manager [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Start building networks asynchronously for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG nova.compute.manager [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Allocating IP information in the background. 
{{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG nova.network.neutron [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] allocate_for_instance() {{(pid=71205) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 24 00:24:52 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 24 00:24:52 user nova-compute[71205]: DEBUG nova.compute.manager [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Start building block device mappings for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG nova.policy [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '61677897acda43ca83dd2c702e6e5eb5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ffcdb2ab0e4145ad988f91b13ebd629e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71205) authorize /opt/stack/nova/nova/policy.py:203}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG nova.compute.manager [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Start spawning the instance on the hypervisor. 
{{(pid=71205) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Creating instance directory {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 24 00:24:52 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Creating image(s) Apr 24 00:24:52 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Acquiring lock "/opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Lock "/opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Lock "/opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.006s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.145s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG 
oslo_concurrency.lockutils [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Acquiring lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.136s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk 1073741824 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk 1073741824" returned: 0 in 0.043s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Lock "26d4c718c7a2a978d2022c858a570bbc0ccab5d8" 
"released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.186s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.131s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Checking if we can resize image /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk. size=1073741824 {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Cannot resize image /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk to a smaller size. 
{{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG nova.objects.instance [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Lazy-loading 'migration_context' on Instance uuid 5d60a364-b914-4d2c-8fcc-5944148dcecf {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Created local disks {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Ensure instance console log exists: /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/console.log {{(pid=71205) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:24:52 user nova-compute[71205]: DEBUG nova.network.neutron [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Successfully created port: a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de {{(pid=71205) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 24 00:24:53 user nova-compute[71205]: DEBUG nova.network.neutron [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Successfully updated port: a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de {{(pid=71205) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 24 00:24:53 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils 
[None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Acquiring lock "refresh_cache-5d60a364-b914-4d2c-8fcc-5944148dcecf" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:24:53 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Acquired lock "refresh_cache-5d60a364-b914-4d2c-8fcc-5944148dcecf" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:24:53 user nova-compute[71205]: DEBUG nova.network.neutron [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Building network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 24 00:24:53 user nova-compute[71205]: DEBUG nova.compute.manager [req-0a7b8ddd-3e72-47a9-b9a1-de99d3d72bfe req-5bf93a52-7ef2-489b-94a8-3afd28dbd9e4 service nova] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Received event network-changed-a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:24:53 user nova-compute[71205]: DEBUG nova.compute.manager [req-0a7b8ddd-3e72-47a9-b9a1-de99d3d72bfe req-5bf93a52-7ef2-489b-94a8-3afd28dbd9e4 service nova] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Refreshing instance network info cache due to event network-changed-a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de. {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:24:53 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-0a7b8ddd-3e72-47a9-b9a1-de99d3d72bfe req-5bf93a52-7ef2-489b-94a8-3afd28dbd9e4 service nova] Acquiring lock "refresh_cache-5d60a364-b914-4d2c-8fcc-5944148dcecf" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:24:53 user nova-compute[71205]: DEBUG nova.network.neutron [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Instance cache missing network info. 
{{(pid=71205) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 24 00:24:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG nova.network.neutron [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Updating instance_info_cache with network_info: [{"id": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "address": "fa:16:3e:9b:ae:bb", "network": {"id": "01988cc6-152f-4948-9028-cb1f71426e3c", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1758399776-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ffcdb2ab0e4145ad988f91b13ebd629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cfeb44-bd", "ovs_interfaceid": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Releasing lock "refresh_cache-5d60a364-b914-4d2c-8fcc-5944148dcecf" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG nova.compute.manager [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Instance network_info: |[{"id": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "address": "fa:16:3e:9b:ae:bb", "network": {"id": "01988cc6-152f-4948-9028-cb1f71426e3c", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1758399776-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ffcdb2ab0e4145ad988f91b13ebd629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cfeb44-bd", "ovs_interfaceid": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG 
oslo_concurrency.lockutils [req-0a7b8ddd-3e72-47a9-b9a1-de99d3d72bfe req-5bf93a52-7ef2-489b-94a8-3afd28dbd9e4 service nova] Acquired lock "refresh_cache-5d60a364-b914-4d2c-8fcc-5944148dcecf" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG nova.network.neutron [req-0a7b8ddd-3e72-47a9-b9a1-de99d3d72bfe req-5bf93a52-7ef2-489b-94a8-3afd28dbd9e4 service nova] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Refreshing network info cache for port a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Start _get_guest_xml network_info=[{"id": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "address": "fa:16:3e:9b:ae:bb", "network": {"id": "01988cc6-152f-4948-9028-cb1f71426e3c", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1758399776-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ffcdb2ab0e4145ad988f91b13ebd629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cfeb44-bd", "ovs_interfaceid": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': 'fcf09ead-c5af-40cc-b5cf-92626e181ef9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 24 00:24:54 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 24 00:24:54 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:24:54 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71205) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-24T00:08:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:07:34Z,direct_url=,disk_format='qcow2',id=fcf09ead-c5af-40cc-b5cf-92626e181ef9,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='9619d6bf1eb94713942f61eb4a34f7c4',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:07:36Z,virtual_size=,visibility=), allow threads: True {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Flavor limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Image limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Flavor pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Image pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71205) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:425}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Got 1 possible topologies {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:24:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-269082558',display_name='tempest-ServerStableDeviceRescueTest-server-269082558',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverstabledevicerescuetest-server-269082558',id=23,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOIM4N11geMCFNlqf+WumRT8Io9+APZabKwlT29VAkxGMpvYD7hEgMyCcPGCln3NuH+chYQtwvDTqcXB35chovwequeYHVWAICGxNgGLzMB8oEjcCJdDttXbnQwU4HFcCw==',key_name='tempest-keypair-1862289859',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffcdb2ab0e4145ad988f91b13ebd629e',ramdisk_id='',reservation_id='r-x7hhkwnv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1726718806',owner_user_name='tempest-ServerStableDeviceRescueTest-1726718806-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:24:52Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='61677897acda43ca83dd2c702e6e5eb5',uuid=5d60a364-b914-4d2c-8fcc-5944148dcecf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "address": "fa:16:3e:9b:ae:bb", "network": {"id": "01988cc6-152f-4948-9028-cb1f71426e3c", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1758399776-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ffcdb2ab0e4145ad988f91b13ebd629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cfeb44-bd", "ovs_interfaceid": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71205) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Converting VIF {"id": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "address": "fa:16:3e:9b:ae:bb", "network": {"id": "01988cc6-152f-4948-9028-cb1f71426e3c", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1758399776-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server":
"10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ffcdb2ab0e4145ad988f91b13ebd629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cfeb44-bd", "ovs_interfaceid": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:ae:bb,bridge_name='br-int',has_traffic_filtering=True,id=a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de,network=Network(01988cc6-152f-4948-9028-cb1f71426e3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8cfeb44-bd') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG nova.objects.instance [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Lazy-loading 'pci_devices' on Instance uuid 5d60a364-b914-4d2c-8fcc-5944148dcecf {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] End _get_guest_xml xml=
[guest domain XML elided: the element markup was stripped when this log was captured, leaving only bare text nodes interleaved with the journal prefixes; the surviving values are the instance UUID 5d60a364-b914-4d2c-8fcc-5944148dcecf, domain name instance-00000017, memory 131072, 1 vCPU, display name tempest-ServerStableDeviceRescueTest-server-269082558, creation time 2023-04-24 00:24:54, flavor fields 128/1/0/0/1, owner tempest-ServerStableDeviceRescueTest-1726718806-project-member / tempest-ServerStableDeviceRescueTest-1726718806, sysinfo strings OpenStack Foundation, OpenStack Nova, 0.0.0, Virtual Machine, OS type hvm, CPU model Nehalem, and RNG backend /dev/urandom]
Apr 24 00:24:54 user nova-compute[71205]: {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:24:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-269082558',display_name='tempest-ServerStableDeviceRescueTest-server-269082558',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverstabledevicerescuetest-server-269082558',id=23,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOIM4N11geMCFNlqf+WumRT8Io9+APZabKwlT29VAkxGMpvYD7hEgMyCcPGCln3NuH+chYQtwvDTqcXB35chovwequeYHVWAICGxNgGLzMB8oEjcCJdDttXbnQwU4HFcCw==',key_name='tempest-keypair-1862289859',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ffcdb2ab0e4145ad988f91b13ebd629e',ramdisk_id='',reservation_id='r-x7hhkwnv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1726718806',owner_user_name='tempest-ServerStableDeviceRescueTest-1726718806-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:24:52Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='61677897acda43ca83dd2c702e6e5eb5',uuid=5d60a364-b914-4d2c-8fcc-5944148dcecf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "address": "fa:16:3e:9b:ae:bb", "network": {"id": "01988cc6-152f-4948-9028-cb1f71426e3c", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1758399776-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ffcdb2ab0e4145ad988f91b13ebd629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cfeb44-bd", "ovs_interfaceid": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Converting VIF {"id": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "address": "fa:16:3e:9b:ae:bb", "network": {"id": "01988cc6-152f-4948-9028-cb1f71426e3c", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1758399776-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": 
"10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ffcdb2ab0e4145ad988f91b13ebd629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cfeb44-bd", "ovs_interfaceid": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:ae:bb,bridge_name='br-int',has_traffic_filtering=True,id=a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de,network=Network(01988cc6-152f-4948-9028-cb1f71426e3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8cfeb44-bd') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG os_vif [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:ae:bb,bridge_name='br-int',has_traffic_filtering=True,id=a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de,network=Network(01988cc6-152f-4948-9028-cb1f71426e3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8cfeb44-bd') {{(pid=71205) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa8cfeb44-bd, may_exist=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa8cfeb44-bd, col_values=(('external_ids', {'iface-id': 'a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9b:ae:bb', 'vm-uuid': '5d60a364-b914-4d2c-8fcc-5944148dcecf'}),)) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 
00:24:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:24:54 user nova-compute[71205]: INFO os_vif [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:ae:bb,bridge_name='br-int',has_traffic_filtering=True,id=a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de,network=Network(01988cc6-152f-4948-9028-cb1f71426e3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8cfeb44-bd') Apr 24 00:24:54 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] No BDM found with device name vda, not building metadata. {{(pid=71205) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] No VIF found with MAC fa:16:3e:9b:ae:bb, not building metadata {{(pid=71205) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG nova.network.neutron [req-0a7b8ddd-3e72-47a9-b9a1-de99d3d72bfe req-5bf93a52-7ef2-489b-94a8-3afd28dbd9e4 service nova] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Updated VIF entry in instance network info cache for port a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de. 
{{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG nova.network.neutron [req-0a7b8ddd-3e72-47a9-b9a1-de99d3d72bfe req-5bf93a52-7ef2-489b-94a8-3afd28dbd9e4 service nova] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Updating instance_info_cache with network_info: [{"id": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "address": "fa:16:3e:9b:ae:bb", "network": {"id": "01988cc6-152f-4948-9028-cb1f71426e3c", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1758399776-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ffcdb2ab0e4145ad988f91b13ebd629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cfeb44-bd", "ovs_interfaceid": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:24:54 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-0a7b8ddd-3e72-47a9-b9a1-de99d3d72bfe req-5bf93a52-7ef2-489b-94a8-3afd28dbd9e4 service nova] Releasing lock "refresh_cache-5d60a364-b914-4d2c-8fcc-5944148dcecf" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:24:55 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:24:55 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:24:55 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:24:55 user nova-compute[71205]: DEBUG nova.compute.manager [req-b4a1542f-385d-443f-813e-55e0f51d6032 req-ac983f8b-ad73-4e0a-b835-28cb7e1d8e87 service nova] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Received event network-vif-plugged-a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:24:55 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-b4a1542f-385d-443f-813e-55e0f51d6032 req-ac983f8b-ad73-4e0a-b835-28cb7e1d8e87 service nova] Acquiring lock "5d60a364-b914-4d2c-8fcc-5944148dcecf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:24:55 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-b4a1542f-385d-443f-813e-55e0f51d6032 req-ac983f8b-ad73-4e0a-b835-28cb7e1d8e87 service nova] Lock "5d60a364-b914-4d2c-8fcc-5944148dcecf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:24:55 
user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-b4a1542f-385d-443f-813e-55e0f51d6032 req-ac983f8b-ad73-4e0a-b835-28cb7e1d8e87 service nova] Lock "5d60a364-b914-4d2c-8fcc-5944148dcecf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:24:55 user nova-compute[71205]: DEBUG nova.compute.manager [req-b4a1542f-385d-443f-813e-55e0f51d6032 req-ac983f8b-ad73-4e0a-b835-28cb7e1d8e87 service nova] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] No waiting events found dispatching network-vif-plugged-a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:24:55 user nova-compute[71205]: WARNING nova.compute.manager [req-b4a1542f-385d-443f-813e-55e0f51d6032 req-ac983f8b-ad73-4e0a-b835-28cb7e1d8e87 service nova] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Received unexpected event network-vif-plugged-a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de for instance with vm_state building and task_state spawning. Apr 24 00:24:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:24:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:24:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:24:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:24:57 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Resumed> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:24:57 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] VM Resumed (Lifecycle Event) Apr 24 00:24:57 user nova-compute[71205]: DEBUG nova.compute.manager [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Instance event wait completed in 0 seconds for {{(pid=71205) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 24 00:24:57 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Guest created on hypervisor {{(pid=71205) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 24 00:24:57 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Instance spawned successfully. 
Apr 24 00:24:57 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 24 00:24:57 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:24:57 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:24:57 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Found default for hw_cdrom_bus of ide {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:24:57 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Found default for hw_disk_bus of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:24:57 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Found default for hw_input_bus of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:24:57 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Found default for hw_pointer_model of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:24:57 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Found default for hw_video_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:24:57 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 
tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Found default for hw_vif_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:24:57 user nova-compute[71205]: DEBUG nova.compute.manager [req-ccbd505e-575f-427a-9e5f-a1350a876285 req-52e454d0-0e10-4ce2-988b-70f51a8ca067 service nova] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Received event network-vif-plugged-a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:24:57 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-ccbd505e-575f-427a-9e5f-a1350a876285 req-52e454d0-0e10-4ce2-988b-70f51a8ca067 service nova] Acquiring lock "5d60a364-b914-4d2c-8fcc-5944148dcecf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:24:57 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-ccbd505e-575f-427a-9e5f-a1350a876285 req-52e454d0-0e10-4ce2-988b-70f51a8ca067 service nova] Lock "5d60a364-b914-4d2c-8fcc-5944148dcecf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:24:57 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-ccbd505e-575f-427a-9e5f-a1350a876285 req-52e454d0-0e10-4ce2-988b-70f51a8ca067 service nova] Lock "5d60a364-b914-4d2c-8fcc-5944148dcecf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:24:57 user nova-compute[71205]: DEBUG nova.compute.manager [req-ccbd505e-575f-427a-9e5f-a1350a876285 req-52e454d0-0e10-4ce2-988b-70f51a8ca067 service nova] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] No waiting events found dispatching network-vif-plugged-a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:24:57 user nova-compute[71205]: WARNING nova.compute.manager [req-ccbd505e-575f-427a-9e5f-a1350a876285 req-52e454d0-0e10-4ce2-988b-70f51a8ca067 service nova] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Received unexpected event network-vif-plugged-a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de for instance with vm_state building and task_state spawning. Apr 24 00:24:57 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 24 00:24:57 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Started> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:24:57 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] VM Started (Lifecycle Event) Apr 24 00:24:57 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:24:57 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:24:57 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:24:57 user nova-compute[71205]: INFO nova.compute.manager [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Took 5.76 seconds to spawn the instance on the hypervisor. Apr 24 00:24:57 user nova-compute[71205]: DEBUG nova.compute.manager [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:24:58 user nova-compute[71205]: INFO nova.compute.manager [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Took 6.24 seconds to build instance. 
Apr 24 00:24:58 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-65d0ef01-4a23-42ff-be50-cc8c7cc2715c tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Lock "5d60a364-b914-4d2c-8fcc-5944148dcecf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.346s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:24:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:24:59 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:25:03 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:25:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:25:09 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:25:09 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:25:09 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:25:09 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:25:09 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:25:09 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:25:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:25:14 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:25:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:25:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:25:20 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:25:23 user nova-compute[71205]: DEBUG 
oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:25:23 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:25:23 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:25:24 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:25:24 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:25:24 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:25:24 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71205) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 24 00:25:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:25:26 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:25:26 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:25:26 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:25:26 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Starting heal instance info cache {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 24 00:25:26 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Rebuilding the list of instances to heal {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 24 00:25:26 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "refresh_cache-5d60a364-b914-4d2c-8fcc-5944148dcecf" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:25:26 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquired lock "refresh_cache-5d60a364-b914-4d2c-8fcc-5944148dcecf" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:25:26 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Forcefully refreshing network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 24 00:25:26 user nova-compute[71205]: DEBUG nova.objects.instance [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lazy-loading 'info_cache' on Instance uuid 5d60a364-b914-4d2c-8fcc-5944148dcecf {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:25:26 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Updating instance_info_cache with network_info: [{"id": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "address": "fa:16:3e:9b:ae:bb", "network": {"id": "01988cc6-152f-4948-9028-cb1f71426e3c", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1758399776-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ffcdb2ab0e4145ad988f91b13ebd629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cfeb44-bd", "ovs_interfaceid": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:25:26 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Releasing lock "refresh_cache-5d60a364-b914-4d2c-8fcc-5944148dcecf" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:25:26 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Updated the network info_cache for instance {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 24 00:25:26 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:25:26 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:25:26 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:25:26 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:25:26 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:25:26 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:25:27 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:25:27 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:25:27 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:25:27 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:25:27 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:25:27 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Hypervisor/Node resource view: name=user free_ram=9121MB free_disk=26.633441925048828GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": 
"type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", 
"vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:25:27 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:25:27 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:25:27 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance 5d60a364-b914-4d2c-8fcc-5944148dcecf actively managed on this compute host and has 
allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:25:27 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:25:27 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:25:27 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:25:27 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:25:27 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:25:27 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.186s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:25:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:25:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:25:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:25:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:25:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:25:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:25:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:25:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:25:39 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:25:44 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:25:44 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:25:44 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:25:44 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:25:44 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:25:44 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:25:49 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:25:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:25:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:25:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:25:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:25:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:25:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:25:59 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:25:59 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:25:59 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:25:59 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:25:59 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:25:59 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:26:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:26:08 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:26:09 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:26:14 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:26:14 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:26:14 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:26:14 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:26:14 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:26:14 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:26:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:26:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:26:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:26:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:26:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE 
{{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:26:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:26:22 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:26:24 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:26:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:26:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:26:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:26:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:26:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:26:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:26:25 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:26:25 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:26:25 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:26:25 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71205) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 24 00:26:26 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:26:27 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:26:28 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:26:28 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Starting heal instance info cache {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 24 00:26:28 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Rebuilding the list of instances to heal {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 24 00:26:28 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "refresh_cache-5d60a364-b914-4d2c-8fcc-5944148dcecf" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:26:28 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquired lock "refresh_cache-5d60a364-b914-4d2c-8fcc-5944148dcecf" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:26:28 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Forcefully refreshing network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 24 00:26:28 user nova-compute[71205]: DEBUG nova.objects.instance [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lazy-loading 'info_cache' on Instance uuid 5d60a364-b914-4d2c-8fcc-5944148dcecf {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:26:28 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Updating instance_info_cache with network_info: [{"id": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "address": "fa:16:3e:9b:ae:bb", "network": {"id": "01988cc6-152f-4948-9028-cb1f71426e3c", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1758399776-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ffcdb2ab0e4145ad988f91b13ebd629e", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cfeb44-bd", "ovs_interfaceid": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:26:28 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Releasing lock "refresh_cache-5d60a364-b914-4d2c-8fcc-5944148dcecf" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:26:28 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Updated the network info_cache for instance {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 24 00:26:28 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:26:28 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:26:28 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:26:28 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:26:28 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:26:28 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:26:29 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk --force-share --output=json" 
returned: 0 in 0.128s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:26:29 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:26:29 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:26:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:26:29 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:26:29 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:26:29 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Hypervisor/Node resource view: name=user free_ram=9158MB free_disk=26.63283920288086GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": 
"type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", 
"vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:26:29 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:26:29 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:26:29 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance 5d60a364-b914-4d2c-8fcc-5944148dcecf actively managed on this compute host and has 
allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:26:29 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:26:29 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:26:29 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:26:29 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:26:29 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:26:29 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.165s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:26:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:26:39 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:26:42 user nova-compute[71205]: DEBUG nova.compute.manager [req-f53e834e-6668-4d07-b517-e96ac63508f9 req-3ab33c3f-7343-4b80-80d4-8ed0574bded5 service nova] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Received event network-changed-a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:26:42 user nova-compute[71205]: DEBUG nova.compute.manager [req-f53e834e-6668-4d07-b517-e96ac63508f9 req-3ab33c3f-7343-4b80-80d4-8ed0574bded5 service nova] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Refreshing instance network info cache due to event network-changed-a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de. 
{{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:26:42 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-f53e834e-6668-4d07-b517-e96ac63508f9 req-3ab33c3f-7343-4b80-80d4-8ed0574bded5 service nova] Acquiring lock "refresh_cache-5d60a364-b914-4d2c-8fcc-5944148dcecf" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:26:42 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-f53e834e-6668-4d07-b517-e96ac63508f9 req-3ab33c3f-7343-4b80-80d4-8ed0574bded5 service nova] Acquired lock "refresh_cache-5d60a364-b914-4d2c-8fcc-5944148dcecf" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:26:42 user nova-compute[71205]: DEBUG nova.network.neutron [req-f53e834e-6668-4d07-b517-e96ac63508f9 req-3ab33c3f-7343-4b80-80d4-8ed0574bded5 service nova] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Refreshing network info cache for port a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:26:43 user nova-compute[71205]: DEBUG nova.network.neutron [req-f53e834e-6668-4d07-b517-e96ac63508f9 req-3ab33c3f-7343-4b80-80d4-8ed0574bded5 service nova] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Updated VIF entry in instance network info cache for port a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de. {{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:26:43 user nova-compute[71205]: DEBUG nova.network.neutron [req-f53e834e-6668-4d07-b517-e96ac63508f9 req-3ab33c3f-7343-4b80-80d4-8ed0574bded5 service nova] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Updating instance_info_cache with network_info: [{"id": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "address": "fa:16:3e:9b:ae:bb", "network": {"id": "01988cc6-152f-4948-9028-cb1f71426e3c", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1758399776-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.117", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ffcdb2ab0e4145ad988f91b13ebd629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cfeb44-bd", "ovs_interfaceid": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:26:43 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-f53e834e-6668-4d07-b517-e96ac63508f9 req-3ab33c3f-7343-4b80-80d4-8ed0574bded5 service nova] Releasing lock "refresh_cache-5d60a364-b914-4d2c-8fcc-5944148dcecf" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:26:43 user nova-compute[71205]: DEBUG nova.compute.manager [None req-582a45ec-6935-4e66-95a0-e8ae28bf2e6e tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 
5d60a364-b914-4d2c-8fcc-5944148dcecf] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:26:43 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:26:43 user nova-compute[71205]: INFO nova.compute.manager [None req-582a45ec-6935-4e66-95a0-e8ae28bf2e6e tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] instance snapshotting Apr 24 00:26:43 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-582a45ec-6935-4e66-95a0-e8ae28bf2e6e tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Beginning live snapshot process Apr 24 00:26:44 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-582a45ec-6935-4e66-95a0-e8ae28bf2e6e tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk --force-share --output=json -f qcow2 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:26:44 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-582a45ec-6935-4e66-95a0-e8ae28bf2e6e tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk --force-share --output=json -f qcow2" returned: 0 in 0.137s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:26:44 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-582a45ec-6935-4e66-95a0-e8ae28bf2e6e tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk --force-share --output=json -f qcow2 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:26:44 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-582a45ec-6935-4e66-95a0-e8ae28bf2e6e tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk --force-share --output=json -f qcow2" returned: 0 in 0.136s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:26:44 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-582a45ec-6935-4e66-95a0-e8ae28bf2e6e tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:26:44 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-582a45ec-6935-4e66-95a0-e8ae28bf2e6e tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8 --force-share --output=json" returned: 0 in 0.135s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:26:44 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-582a45ec-6935-4e66-95a0-e8ae28bf2e6e tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmp3esu8u_5/71e921ff55cd4c52b088b5a15612d694.delta 1073741824 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:26:44 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-582a45ec-6935-4e66-95a0-e8ae28bf2e6e tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmp3esu8u_5/71e921ff55cd4c52b088b5a15612d694.delta 1073741824" returned: 0 in 0.053s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:26:44 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-582a45ec-6935-4e66-95a0-e8ae28bf2e6e tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Quiescing instance not available: QEMU guest agent is not enabled. 
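The two qemu-img invocations recorded above are the core of the libvirt driver's external live snapshot: an empty qcow2 overlay is created on top of the raw base image, the running guest's disk is block-copied into that delta, and the delta is later flattened into a standalone qcow2 for upload. A minimal Python sketch of the same sequence, not part of the log, using only the paths and the 1073741824-byte virtual size shown in the entries above (everything else is illustrative):

    # Sketch only: replays the overlay-and-convert steps nova-compute logged above.
    import subprocess

    BASE = "/opt/stack/data/nova/instances/_base/26d4c718c7a2a978d2022c858a570bbc0ccab5d8"
    DELTA = "/opt/stack/data/nova/instances/snapshots/tmp3esu8u_5/71e921ff55cd4c52b088b5a15612d694.delta"
    OUT = DELTA[:-len(".delta")]

    # 1. Create a qcow2 overlay backed by the (raw) base image; the COPY block job
    #    then mirrors the guest disk into this delta while the guest keeps running.
    subprocess.run(
        ["qemu-img", "create", "-f", "qcow2",
         "-o", f"backing_file={BASE},backing_fmt=raw",
         DELTA, "1073741824"],
        check=True,
    )

    # 2. Flatten the delta into a standalone qcow2 image suitable for upload.
    subprocess.run(
        ["qemu-img", "convert", "-t", "none", "-O", "qcow2", "-f", "qcow2", DELTA, OUT],
        check=True,
    )

The subsequent log entries ("COPY block job progress", "Snapshot extracted, beginning image upload") correspond to the block-copy and convert/upload phases of this flow.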
Apr 24 00:26:44 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:26:45 user nova-compute[71205]: DEBUG nova.virt.libvirt.guest [None req-582a45ec-6935-4e66-95a0-e8ae28bf2e6e tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] COPY block job progress, current cursor: 0 final cursor: 43778048 {{(pid=71205) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 24 00:26:45 user nova-compute[71205]: DEBUG nova.virt.libvirt.guest [None req-582a45ec-6935-4e66-95a0-e8ae28bf2e6e tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] COPY block job progress, current cursor: 43778048 final cursor: 43778048 {{(pid=71205) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 24 00:26:45 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-582a45ec-6935-4e66-95a0-e8ae28bf2e6e tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Skipping quiescing instance: QEMU guest agent is not enabled. Apr 24 00:26:45 user nova-compute[71205]: DEBUG nova.privsep.utils [None req-582a45ec-6935-4e66-95a0-e8ae28bf2e6e tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71205) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 24 00:26:45 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-582a45ec-6935-4e66-95a0-e8ae28bf2e6e tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmp3esu8u_5/71e921ff55cd4c52b088b5a15612d694.delta /opt/stack/data/nova/instances/snapshots/tmp3esu8u_5/71e921ff55cd4c52b088b5a15612d694 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:26:46 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-582a45ec-6935-4e66-95a0-e8ae28bf2e6e tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmp3esu8u_5/71e921ff55cd4c52b088b5a15612d694.delta /opt/stack/data/nova/instances/snapshots/tmp3esu8u_5/71e921ff55cd4c52b088b5a15612d694" returned: 0 in 0.386s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:26:46 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-582a45ec-6935-4e66-95a0-e8ae28bf2e6e tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Snapshot extracted, beginning image upload Apr 24 00:26:48 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-582a45ec-6935-4e66-95a0-e8ae28bf2e6e tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Snapshot image upload complete Apr 24 00:26:48 user nova-compute[71205]: INFO nova.compute.manager [None 
req-582a45ec-6935-4e66-95a0-e8ae28bf2e6e tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Took 4.56 seconds to snapshot the instance on the hypervisor. Apr 24 00:26:48 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:26:49 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:26:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:26:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:26:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:26:59 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:27:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:27:08 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:27:09 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:27:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:27:14 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:27:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:27:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:27:21 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:27:21 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Cleaning up deleted instances with incomplete migration {{(pid=71205) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 24 00:27:23 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71205) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:27:24 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:27:24 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:27:24 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Cleaning up deleted instances {{(pid=71205) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 24 00:27:24 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] There are 0 instances to clean {{(pid=71205) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 24 00:27:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:27:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:27:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:27:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:27:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:27:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:27:25 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:27:26 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:27:26 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:27:27 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71205) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:27:27 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:27:27 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71205) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 24 00:27:28 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:27:28 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:27:28 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:27:28 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:27:28 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:27:28 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:27:28 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:27:28 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:27:28 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:27:28 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:27:28 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:27:29 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:27:29 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:27:29 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Hypervisor/Node resource view: name=user free_ram=9148MB free_disk=26.596019744873047GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": 
"type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", 
"vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:27:29 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:27:29 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:27:29 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance 5d60a364-b914-4d2c-8fcc-5944148dcecf actively managed on this compute host and has 
allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:27:29 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:27:29 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:27:29 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:27:29 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:27:29 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:27:29 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.161s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:27:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:27:30 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:27:30 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Starting heal instance info cache {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 24 00:27:30 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Rebuilding the list of instances to heal {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 24 00:27:30 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None 
req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "refresh_cache-5d60a364-b914-4d2c-8fcc-5944148dcecf" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:27:30 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquired lock "refresh_cache-5d60a364-b914-4d2c-8fcc-5944148dcecf" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:27:30 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Forcefully refreshing network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 24 00:27:30 user nova-compute[71205]: DEBUG nova.objects.instance [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lazy-loading 'info_cache' on Instance uuid 5d60a364-b914-4d2c-8fcc-5944148dcecf {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:27:30 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Updating instance_info_cache with network_info: [{"id": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "address": "fa:16:3e:9b:ae:bb", "network": {"id": "01988cc6-152f-4948-9028-cb1f71426e3c", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1758399776-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.117", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ffcdb2ab0e4145ad988f91b13ebd629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cfeb44-bd", "ovs_interfaceid": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:27:30 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Releasing lock "refresh_cache-5d60a364-b914-4d2c-8fcc-5944148dcecf" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:27:30 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Updated the network info_cache for instance {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 24 00:27:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:27:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:27:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog 
[-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:27:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:27:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:27:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:27:39 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:27:39 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:27:39 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:27:39 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:27:39 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:27:39 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:27:44 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:27:49 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:27:49 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:27:49 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:27:49 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:27:49 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:27:49 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:27:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:27:59 user nova-compute[71205]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:27:59 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:27:59 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:27:59 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:27:59 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:27:59 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:28:04 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._sync_power_states {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:28:04 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Triggering sync for uuid 5d60a364-b914-4d2c-8fcc-5944148dcecf {{(pid=71205) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 24 00:28:04 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "5d60a364-b914-4d2c-8fcc-5944148dcecf" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:28:04 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "5d60a364-b914-4d2c-8fcc-5944148dcecf" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:28:04 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "5d60a364-b914-4d2c-8fcc-5944148dcecf" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.026s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:28:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:28:09 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:28:14 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:28:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout
{{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:28:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:28:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:28:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:28:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:28:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:28:24 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:28:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:28:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:28:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:28:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:28:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:28:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:28:26 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:28:26 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:28:27 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:28:27 user nova-compute[71205]: DEBUG oslo_service.periodic_task 
[None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:28:28 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:28:28 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71205) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 24 00:28:28 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:28:28 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:28:28 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:28:28 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:28:28 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:28:28 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:28:28 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:28:28 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd 
(subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:28:28 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:28:29 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:28:29 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:28:29 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Hypervisor/Node resource view: name=user free_ram=9157MB free_disk=26.59543228149414GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:28:29 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:28:29 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:28:29 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance 5d60a364-b914-4d2c-8fcc-5944148dcecf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:28:29 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:28:29 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:28:29 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Refreshing inventories for resource provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 24 00:28:29 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Updating ProviderTree inventory for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 24 00:28:29 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Updating inventory in ProviderTree for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 24 00:28:29 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Refreshing aggregate associations for resource provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4, aggregates: None {{(pid=71205) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 24 00:28:29 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Refreshing trait associations for resource provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4, traits: 
HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_DEVICE_TAGGING {{(pid=71205) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 24 00:28:29 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:28:29 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:28:29 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:28:29 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.392s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:28:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:28:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:28:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:28:29 user nova-compute[71205]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:28:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:28:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:28:30 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:28:31 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:28:31 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Starting heal instance info cache {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 24 00:28:31 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Rebuilding the list of instances to heal {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 24 00:28:31 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "refresh_cache-5d60a364-b914-4d2c-8fcc-5944148dcecf" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:28:31 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquired lock "refresh_cache-5d60a364-b914-4d2c-8fcc-5944148dcecf" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:28:31 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Forcefully refreshing network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 24 00:28:31 user nova-compute[71205]: DEBUG nova.objects.instance [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lazy-loading 'info_cache' on Instance uuid 5d60a364-b914-4d2c-8fcc-5944148dcecf {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:28:31 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Updating instance_info_cache with network_info: [{"id": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "address": "fa:16:3e:9b:ae:bb", "network": {"id": "01988cc6-152f-4948-9028-cb1f71426e3c", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1758399776-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.117", "type": 
"floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ffcdb2ab0e4145ad988f91b13ebd629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cfeb44-bd", "ovs_interfaceid": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:28:31 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Releasing lock "refresh_cache-5d60a364-b914-4d2c-8fcc-5944148dcecf" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:28:31 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Updated the network info_cache for instance {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 24 00:28:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:28:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:28:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:28:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:28:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:28:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:28:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-de97c346-2837-426f-afc6-d0f9609e664a tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Acquiring lock "5d60a364-b914-4d2c-8fcc-5944148dcecf" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:28:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-de97c346-2837-426f-afc6-d0f9609e664a tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Lock "5d60a364-b914-4d2c-8fcc-5944148dcecf" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:28:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-de97c346-2837-426f-afc6-d0f9609e664a 
tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Acquiring lock "5d60a364-b914-4d2c-8fcc-5944148dcecf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:28:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-de97c346-2837-426f-afc6-d0f9609e664a tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Lock "5d60a364-b914-4d2c-8fcc-5944148dcecf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:28:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-de97c346-2837-426f-afc6-d0f9609e664a tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Lock "5d60a364-b914-4d2c-8fcc-5944148dcecf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:28:36 user nova-compute[71205]: INFO nova.compute.manager [None req-de97c346-2837-426f-afc6-d0f9609e664a tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Terminating instance Apr 24 00:28:36 user nova-compute[71205]: DEBUG nova.compute.manager [None req-de97c346-2837-426f-afc6-d0f9609e664a tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Start destroying the instance on the hypervisor. 
{{(pid=71205) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 24 00:28:36 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:28:36 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:28:36 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:28:37 user nova-compute[71205]: DEBUG nova.compute.manager [req-d5b5e18f-b31d-4f83-9440-d6b4f614dcee req-b7d852f6-b7ff-4a22-96db-e105b2725c43 service nova] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Received event network-vif-unplugged-a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:28:37 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-d5b5e18f-b31d-4f83-9440-d6b4f614dcee req-b7d852f6-b7ff-4a22-96db-e105b2725c43 service nova] Acquiring lock "5d60a364-b914-4d2c-8fcc-5944148dcecf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:28:37 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-d5b5e18f-b31d-4f83-9440-d6b4f614dcee req-b7d852f6-b7ff-4a22-96db-e105b2725c43 service nova] Lock "5d60a364-b914-4d2c-8fcc-5944148dcecf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:28:37 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-d5b5e18f-b31d-4f83-9440-d6b4f614dcee req-b7d852f6-b7ff-4a22-96db-e105b2725c43 service nova] Lock "5d60a364-b914-4d2c-8fcc-5944148dcecf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:28:37 user nova-compute[71205]: DEBUG nova.compute.manager [req-d5b5e18f-b31d-4f83-9440-d6b4f614dcee req-b7d852f6-b7ff-4a22-96db-e105b2725c43 service nova] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] No waiting events found dispatching network-vif-unplugged-a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:28:37 user nova-compute[71205]: DEBUG nova.compute.manager [req-d5b5e18f-b31d-4f83-9440-d6b4f614dcee req-b7d852f6-b7ff-4a22-96db-e105b2725c43 service nova] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Received event network-vif-unplugged-a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de for instance with task_state deleting. {{(pid=71205) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 24 00:28:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:28:37 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Instance destroyed successfully. 
Apr 24 00:28:37 user nova-compute[71205]: DEBUG nova.objects.instance [None req-de97c346-2837-426f-afc6-d0f9609e664a tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Lazy-loading 'resources' on Instance uuid 5d60a364-b914-4d2c-8fcc-5944148dcecf {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:28:37 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-de97c346-2837-426f-afc6-d0f9609e664a tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:24:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-269082558',display_name='tempest-ServerStableDeviceRescueTest-server-269082558',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverstabledevicerescuetest-server-269082558',id=23,image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOIM4N11geMCFNlqf+WumRT8Io9+APZabKwlT29VAkxGMpvYD7hEgMyCcPGCln3NuH+chYQtwvDTqcXB35chovwequeYHVWAICGxNgGLzMB8oEjcCJdDttXbnQwU4HFcCw==',key_name='tempest-keypair-1862289859',keypairs=,launch_index=0,launched_at=2023-04-24T00:24:57Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='ffcdb2ab0e4145ad988f91b13ebd629e',ramdisk_id='',reservation_id='r-x7hhkwnv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fcf09ead-c5af-40cc-b5cf-92626e181ef9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerStableDeviceRescueTest-1726718806',owner_user_name='tempest-ServerStableDeviceRescueTest-1726718806-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-24T00:26:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='61677897acda43ca83dd2c702e6e5eb5',uuid=5d60a364-b914-4d2c-8fcc-5944148dcecf,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "address": "fa:16:3e:9b:ae:bb", "network": {"id": "01988cc6-152f-4948-9028-cb1f71426e3c", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1758399776-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.117", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ffcdb2ab0e4145ad988f91b13ebd629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cfeb44-bd", "ovs_interfaceid": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 24 00:28:37 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-de97c346-2837-426f-afc6-d0f9609e664a tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Converting VIF {"id": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "address": "fa:16:3e:9b:ae:bb", "network": {"id": "01988cc6-152f-4948-9028-cb1f71426e3c", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1758399776-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.117", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "ffcdb2ab0e4145ad988f91b13ebd629e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cfeb44-bd", "ovs_interfaceid": "a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:28:37 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-de97c346-2837-426f-afc6-d0f9609e664a tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9b:ae:bb,bridge_name='br-int',has_traffic_filtering=True,id=a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de,network=Network(01988cc6-152f-4948-9028-cb1f71426e3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8cfeb44-bd') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:28:37 user nova-compute[71205]: DEBUG os_vif [None req-de97c346-2837-426f-afc6-d0f9609e664a tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9b:ae:bb,bridge_name='br-int',has_traffic_filtering=True,id=a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de,network=Network(01988cc6-152f-4948-9028-cb1f71426e3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8cfeb44-bd') {{(pid=71205) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 24 00:28:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 
{{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:28:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa8cfeb44-bd, bridge=br-int, if_exists=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:28:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:28:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:28:37 user nova-compute[71205]: INFO os_vif [None req-de97c346-2837-426f-afc6-d0f9609e664a tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9b:ae:bb,bridge_name='br-int',has_traffic_filtering=True,id=a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de,network=Network(01988cc6-152f-4948-9028-cb1f71426e3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8cfeb44-bd') Apr 24 00:28:37 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-de97c346-2837-426f-afc6-d0f9609e664a tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Deleting instance files /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf_del Apr 24 00:28:37 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-de97c346-2837-426f-afc6-d0f9609e664a tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Deletion of /opt/stack/data/nova/instances/5d60a364-b914-4d2c-8fcc-5944148dcecf_del complete Apr 24 00:28:37 user nova-compute[71205]: INFO nova.compute.manager [None req-de97c346-2837-426f-afc6-d0f9609e664a tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Took 0.66 seconds to destroy the instance on the hypervisor. Apr 24 00:28:37 user nova-compute[71205]: DEBUG oslo.service.loopingcall [None req-de97c346-2837-426f-afc6-d0f9609e664a tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71205) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 24 00:28:37 user nova-compute[71205]: DEBUG nova.compute.manager [-] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Deallocating network for instance {{(pid=71205) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 24 00:28:37 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] deallocate_for_instance() {{(pid=71205) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 24 00:28:38 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:28:38 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Took 0.79 seconds to deallocate network for instance. Apr 24 00:28:38 user nova-compute[71205]: DEBUG nova.compute.manager [req-465b6b4b-0215-4f6a-a245-2da12a879b2c req-2f52b9a7-d2fe-4c8d-b596-b86499f8f334 service nova] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Received event network-vif-deleted-a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:28:38 user nova-compute[71205]: INFO nova.compute.manager [req-465b6b4b-0215-4f6a-a245-2da12a879b2c req-2f52b9a7-d2fe-4c8d-b596-b86499f8f334 service nova] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Neutron deleted interface a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de; detaching it from the instance and deleting it from the info cache Apr 24 00:28:38 user nova-compute[71205]: DEBUG nova.network.neutron [req-465b6b4b-0215-4f6a-a245-2da12a879b2c req-2f52b9a7-d2fe-4c8d-b596-b86499f8f334 service nova] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:28:38 user nova-compute[71205]: DEBUG nova.compute.manager [req-465b6b4b-0215-4f6a-a245-2da12a879b2c req-2f52b9a7-d2fe-4c8d-b596-b86499f8f334 service nova] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Detach interface failed, port_id=a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de, reason: Instance 5d60a364-b914-4d2c-8fcc-5944148dcecf could not be found. 
{{(pid=71205) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 24 00:28:38 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-de97c346-2837-426f-afc6-d0f9609e664a tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:28:38 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-de97c346-2837-426f-afc6-d0f9609e664a tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:28:38 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-de97c346-2837-426f-afc6-d0f9609e664a tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:28:38 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-de97c346-2837-426f-afc6-d0f9609e664a tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:28:38 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-de97c346-2837-426f-afc6-d0f9609e664a tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.110s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:28:38 user nova-compute[71205]: INFO nova.scheduler.client.report [None req-de97c346-2837-426f-afc6-d0f9609e664a tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Deleted allocations for instance 5d60a364-b914-4d2c-8fcc-5944148dcecf Apr 24 00:28:38 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-de97c346-2837-426f-afc6-d0f9609e664a tempest-ServerStableDeviceRescueTest-1726718806 tempest-ServerStableDeviceRescueTest-1726718806-project-member] Lock "5d60a364-b914-4d2c-8fcc-5944148dcecf" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.739s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:28:39 user nova-compute[71205]: DEBUG nova.compute.manager [req-9e45c319-8a3b-4641-9915-69b2c9522677 req-eea834f3-b17d-4718-9466-1907f352d48a service 
nova] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Received event network-vif-plugged-a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:28:39 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-9e45c319-8a3b-4641-9915-69b2c9522677 req-eea834f3-b17d-4718-9466-1907f352d48a service nova] Acquiring lock "5d60a364-b914-4d2c-8fcc-5944148dcecf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:28:39 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-9e45c319-8a3b-4641-9915-69b2c9522677 req-eea834f3-b17d-4718-9466-1907f352d48a service nova] Lock "5d60a364-b914-4d2c-8fcc-5944148dcecf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:28:39 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-9e45c319-8a3b-4641-9915-69b2c9522677 req-eea834f3-b17d-4718-9466-1907f352d48a service nova] Lock "5d60a364-b914-4d2c-8fcc-5944148dcecf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:28:39 user nova-compute[71205]: DEBUG nova.compute.manager [req-9e45c319-8a3b-4641-9915-69b2c9522677 req-eea834f3-b17d-4718-9466-1907f352d48a service nova] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] No waiting events found dispatching network-vif-plugged-a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:28:39 user nova-compute[71205]: WARNING nova.compute.manager [req-9e45c319-8a3b-4641-9915-69b2c9522677 req-eea834f3-b17d-4718-9466-1907f352d48a service nova] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Received unexpected event network-vif-plugged-a8cfeb44-bde7-4cc3-b957-c8db3f7fb3de for instance with vm_state deleted and task_state None. 
Apr 24 00:28:42 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:28:47 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:28:47 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:28:47 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:28:47 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:28:47 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:28:47 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:28:52 user nova-compute[71205]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:28:52 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] VM Stopped (Lifecycle Event) Apr 24 00:28:52 user nova-compute[71205]: DEBUG nova.compute.manager [None req-d7a37ee0-6e4b-4522-84d3-d7eb213ffb45 None None] [instance: 5d60a364-b914-4d2c-8fcc-5944148dcecf] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:28:52 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:28:57 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:29:02 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:29:07 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:29:12 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:29:17 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:29:17 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:29:17 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:29:17 user nova-compute[71205]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:29:17 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:29:17 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:29:22 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:29:24 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:29:26 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:29:27 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:29:27 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:29:27 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:29:27 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:29:27 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:29:27 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:29:27 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:29:27 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:29:28 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:29:29 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None 
req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:29:29 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:29:29 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71205) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 24 00:29:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:29:30 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:29:30 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:29:30 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:29:30 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:29:30 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:29:30 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:29:30 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:29:30 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 24 00:29:30 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Hypervisor/Node resource view: name=user free_ram=9249MB free_disk=26.650009155273438GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:29:30 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:29:30 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:29:30 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:29:30 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:29:30 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:29:30 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:29:30 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:29:30 user nova-compute[71205]: DEBUG 
oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.160s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:29:31 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:29:31 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:29:31 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Starting heal instance info cache {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 24 00:29:31 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Rebuilding the list of instances to heal {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 24 00:29:31 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Didn't find any instances for network info cache update. {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 24 00:29:32 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:29:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:29:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:29:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:29:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:29:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:29:37 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:29:42 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:29:47 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:29:49 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 
{{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:29:51 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:29:52 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:29:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:29:55 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:29:57 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:29:57 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:29:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:01 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Acquiring lock "3441ea8c-1b7f-4db7-b26a-37c56edc2453" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:30:01 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "3441ea8c-1b7f-4db7-b26a-37c56edc2453" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:30:01 user nova-compute[71205]: DEBUG nova.compute.manager [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Starting instance... 
{{(pid=71205) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 24 00:30:01 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:30:01 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:30:01 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71205) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 24 00:30:01 user nova-compute[71205]: INFO nova.compute.claims [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Claim successful on node user Apr 24 00:30:01 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:30:01 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:30:02 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.193s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:30:02 user nova-compute[71205]: DEBUG nova.compute.manager [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Start building networks asynchronously for instance. 
{{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 24 00:30:02 user nova-compute[71205]: DEBUG nova.compute.manager [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Allocating IP information in the background. {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 24 00:30:02 user nova-compute[71205]: DEBUG nova.network.neutron [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] allocate_for_instance() {{(pid=71205) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 24 00:30:02 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 24 00:30:02 user nova-compute[71205]: DEBUG nova.compute.manager [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Start building block device mappings for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 24 00:30:02 user nova-compute[71205]: DEBUG nova.policy [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e2ca48cbf8204d958685eaa42f6b6952', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c607157092ca4b0ea2d01c6fe7fb18c2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71205) authorize /opt/stack/nova/nova/policy.py:203}} Apr 24 00:30:02 user nova-compute[71205]: DEBUG nova.compute.manager [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Start spawning the instance on the hypervisor. 
{{(pid=71205) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 24 00:30:02 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Creating instance directory {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 24 00:30:02 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Creating image(s) Apr 24 00:30:02 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Acquiring lock "/opt/stack/data/nova/instances/3441ea8c-1b7f-4db7-b26a-37c56edc2453/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:30:02 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "/opt/stack/data/nova/instances/3441ea8c-1b7f-4db7-b26a-37c56edc2453/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:30:02 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "/opt/stack/data/nova/instances/3441ea8c-1b7f-4db7-b26a-37c56edc2453/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:30:02 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Acquiring lock "866f3205742ceaebd560c130378cbabd1c6327c4" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:30:02 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "866f3205742ceaebd560c130378cbabd1c6327c4" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:30:02 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img 
info /opt/stack/data/nova/instances/_base/866f3205742ceaebd560c130378cbabd1c6327c4.part --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:30:02 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:02 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/866f3205742ceaebd560c130378cbabd1c6327c4.part --force-share --output=json" returned: 0 in 0.137s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:30:02 user nova-compute[71205]: DEBUG nova.virt.images [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] 9ecdd39b-39aa-4c6d-a449-0f574c554082 was qcow2, converting to raw {{(pid=71205) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 24 00:30:02 user nova-compute[71205]: DEBUG nova.privsep.utils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71205) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 24 00:30:02 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/866f3205742ceaebd560c130378cbabd1c6327c4.part /opt/stack/data/nova/instances/_base/866f3205742ceaebd560c130378cbabd1c6327c4.converted {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:30:02 user nova-compute[71205]: DEBUG nova.network.neutron [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Successfully created port: cd924951-296b-43e1-b7d6-c7769ba90ee7 {{(pid=71205) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 24 00:30:02 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/866f3205742ceaebd560c130378cbabd1c6327c4.part /opt/stack/data/nova/instances/_base/866f3205742ceaebd560c130378cbabd1c6327c4.converted" returned: 0 in 0.253s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:30:02 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- 
env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/866f3205742ceaebd560c130378cbabd1c6327c4.converted --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/866f3205742ceaebd560c130378cbabd1c6327c4.converted --force-share --output=json" returned: 0 in 0.137s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "866f3205742ceaebd560c130378cbabd1c6327c4" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.869s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/866f3205742ceaebd560c130378cbabd1c6327c4 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/866f3205742ceaebd560c130378cbabd1c6327c4 --force-share --output=json" returned: 0 in 0.135s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Acquiring lock "866f3205742ceaebd560c130378cbabd1c6327c4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "866f3205742ceaebd560c130378cbabd1c6327c4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 
tempest-TestMinimumBasicScenario-776289633-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/866f3205742ceaebd560c130378cbabd1c6327c4 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/866f3205742ceaebd560c130378cbabd1c6327c4 --force-share --output=json" returned: 0 in 0.127s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/866f3205742ceaebd560c130378cbabd1c6327c4,backing_fmt=raw /opt/stack/data/nova/instances/3441ea8c-1b7f-4db7-b26a-37c56edc2453/disk 1073741824 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/866f3205742ceaebd560c130378cbabd1c6327c4,backing_fmt=raw /opt/stack/data/nova/instances/3441ea8c-1b7f-4db7-b26a-37c56edc2453/disk 1073741824" returned: 0 in 0.047s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "866f3205742ceaebd560c130378cbabd1c6327c4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.181s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/866f3205742ceaebd560c130378cbabd1c6327c4 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/866f3205742ceaebd560c130378cbabd1c6327c4 --force-share --output=json" returned: 0 in 0.138s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Checking if we can resize image /opt/stack/data/nova/instances/3441ea8c-1b7f-4db7-b26a-37c56edc2453/disk. size=1073741824 {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3441ea8c-1b7f-4db7-b26a-37c56edc2453/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG nova.network.neutron [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Successfully updated port: cd924951-296b-43e1-b7d6-c7769ba90ee7 {{(pid=71205) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Acquiring lock "refresh_cache-3441ea8c-1b7f-4db7-b26a-37c56edc2453" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Acquired lock "refresh_cache-3441ea8c-1b7f-4db7-b26a-37c56edc2453" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG nova.network.neutron [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Building network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG nova.network.neutron [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Instance cache missing network info. 
{{(pid=71205) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3441ea8c-1b7f-4db7-b26a-37c56edc2453/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Cannot resize image /opt/stack/data/nova/instances/3441ea8c-1b7f-4db7-b26a-37c56edc2453/disk to a smaller size. {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG nova.objects.instance [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lazy-loading 'migration_context' on Instance uuid 3441ea8c-1b7f-4db7-b26a-37c56edc2453 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Created local disks {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Ensure instance console log exists: /opt/stack/data/nova/instances/3441ea8c-1b7f-4db7-b26a-37c56edc2453/console.log {{(pid=71205) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71205) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG nova.compute.manager [req-e3bd7387-c8bd-4f5c-ac0a-3d131eb7fa5f req-cafc8e91-16d0-4a26-b416-fa6d6215f28d service nova] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Received event network-changed-cd924951-296b-43e1-b7d6-c7769ba90ee7 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG nova.compute.manager [req-e3bd7387-c8bd-4f5c-ac0a-3d131eb7fa5f req-cafc8e91-16d0-4a26-b416-fa6d6215f28d service nova] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Refreshing instance network info cache due to event network-changed-cd924951-296b-43e1-b7d6-c7769ba90ee7. {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-e3bd7387-c8bd-4f5c-ac0a-3d131eb7fa5f req-cafc8e91-16d0-4a26-b416-fa6d6215f28d service nova] Acquiring lock "refresh_cache-3441ea8c-1b7f-4db7-b26a-37c56edc2453" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG nova.network.neutron [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Updating instance_info_cache with network_info: [{"id": "cd924951-296b-43e1-b7d6-c7769ba90ee7", "address": "fa:16:3e:c1:ad:e8", "network": {"id": "bbd390a3-499e-4b86-9c48-401b14864b06", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-337104155-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c607157092ca4b0ea2d01c6fe7fb18c2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd924951-29", "ovs_interfaceid": "cd924951-296b-43e1-b7d6-c7769ba90ee7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Releasing lock "refresh_cache-3441ea8c-1b7f-4db7-b26a-37c56edc2453" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG nova.compute.manager [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Instance network_info: |[{"id": "cd924951-296b-43e1-b7d6-c7769ba90ee7", "address": "fa:16:3e:c1:ad:e8", "network": 
{"id": "bbd390a3-499e-4b86-9c48-401b14864b06", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-337104155-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c607157092ca4b0ea2d01c6fe7fb18c2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd924951-29", "ovs_interfaceid": "cd924951-296b-43e1-b7d6-c7769ba90ee7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-e3bd7387-c8bd-4f5c-ac0a-3d131eb7fa5f req-cafc8e91-16d0-4a26-b416-fa6d6215f28d service nova] Acquired lock "refresh_cache-3441ea8c-1b7f-4db7-b26a-37c56edc2453" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG nova.network.neutron [req-e3bd7387-c8bd-4f5c-ac0a-3d131eb7fa5f req-cafc8e91-16d0-4a26-b416-fa6d6215f28d service nova] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Refreshing network info cache for port cd924951-296b-43e1-b7d6-c7769ba90ee7 {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:30:03 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Start _get_guest_xml network_info=[{"id": "cd924951-296b-43e1-b7d6-c7769ba90ee7", "address": "fa:16:3e:c1:ad:e8", "network": {"id": "bbd390a3-499e-4b86-9c48-401b14864b06", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-337104155-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c607157092ca4b0ea2d01c6fe7fb18c2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd924951-29", "ovs_interfaceid": "cd924951-296b-43e1-b7d6-c7769ba90ee7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:29:59Z,direct_url=,disk_format='qcow2',id=9ecdd39b-39aa-4c6d-a449-0f574c554082,min_disk=0,min_ram=0,name='tempest-scenario-img--778763031',owner='c607157092ca4b0ea2d01c6fe7fb18c2',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:30:00Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': '9ecdd39b-39aa-4c6d-a449-0f574c554082'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 24 00:30:04 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:30:04 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:30:04 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71205) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-24T00:08:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:29:59Z,direct_url=,disk_format='qcow2',id=9ecdd39b-39aa-4c6d-a449-0f574c554082,min_disk=0,min_ram=0,name='tempest-scenario-img--778763031',owner='c607157092ca4b0ea2d01c6fe7fb18c2',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:30:00Z,virtual_size=,visibility=), allow threads: True {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Flavor limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 
tempest-TestMinimumBasicScenario-776289633-project-member] Image limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Flavor pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Image pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Got 1 possible topologies {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:30:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1216480377',display_name='tempest-TestMinimumBasicScenario-server-1216480377',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1216480377',id=24,image_ref='9ecdd39b-39aa-4c6d-a449-0f574c554082',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFLDJJcBY83bYII2cz2UBFOs5JtscigjyJMQ6jSr5Zo8Gh64wLUW7PYGf39MocYkX0cAAMut0zrK8qkD0z8MC1LDRzLMpevf5gaOOpR4zhAODKbduqMVR4FvaWE7tF8Jag==',key_name='tempest-TestMinimumBasicScenario-1846912878',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c607157092ca4b0ea2d01c6fe7fb18c2',ramdisk_id='',reservation_id='r-537bb9n2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9ecdd39b-39aa-4c6d-a449-0f574c554082',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-776289633',owner_user_name='tempest-TestMinimumBasicScenario-776289633-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:30:02Z,user_data=None,user_id='e2ca48cbf8204d958685eaa42f6b6952',uuid=3441ea8c-1b7f-4db7-b26a-37c56edc2453,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cd924951-296b-43e1-b7d6-c7769ba90ee7", "address": "fa:16:3e:c1:ad:e8", "network": {"id": "bbd390a3-499e-4b86-9c48-401b14864b06", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-337104155-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c607157092ca4b0ea2d01c6fe7fb18c2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd924951-29", "ovs_interfaceid": "cd924951-296b-43e1-b7d6-c7769ba90ee7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71205) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Converting VIF {"id": 
"cd924951-296b-43e1-b7d6-c7769ba90ee7", "address": "fa:16:3e:c1:ad:e8", "network": {"id": "bbd390a3-499e-4b86-9c48-401b14864b06", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-337104155-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c607157092ca4b0ea2d01c6fe7fb18c2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd924951-29", "ovs_interfaceid": "cd924951-296b-43e1-b7d6-c7769ba90ee7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:ad:e8,bridge_name='br-int',has_traffic_filtering=True,id=cd924951-296b-43e1-b7d6-c7769ba90ee7,network=Network(bbd390a3-499e-4b86-9c48-401b14864b06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd924951-29') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG nova.objects.instance [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lazy-loading 'pci_devices' on Instance uuid 3441ea8c-1b7f-4db7-b26a-37c56edc2453 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] End _get_guest_xml xml= Apr 24 00:30:04 user nova-compute[71205]: 3441ea8c-1b7f-4db7-b26a-37c56edc2453 Apr 24 00:30:04 user nova-compute[71205]: instance-00000018 Apr 24 00:30:04 user nova-compute[71205]: 131072 Apr 24 00:30:04 user nova-compute[71205]: 1 Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: tempest-TestMinimumBasicScenario-server-1216480377 Apr 24 00:30:04 user nova-compute[71205]: 2023-04-24 00:30:04 Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: 128 Apr 24 00:30:04 user nova-compute[71205]: 1 Apr 24 00:30:04 user nova-compute[71205]: 0 Apr 24 00:30:04 user nova-compute[71205]: 0 Apr 24 00:30:04 user nova-compute[71205]: 1 Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: tempest-TestMinimumBasicScenario-776289633-project-member Apr 24 00:30:04 user nova-compute[71205]: tempest-TestMinimumBasicScenario-776289633 Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 
user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: OpenStack Foundation Apr 24 00:30:04 user nova-compute[71205]: OpenStack Nova Apr 24 00:30:04 user nova-compute[71205]: 0.0.0 Apr 24 00:30:04 user nova-compute[71205]: 3441ea8c-1b7f-4db7-b26a-37c56edc2453 Apr 24 00:30:04 user nova-compute[71205]: 3441ea8c-1b7f-4db7-b26a-37c56edc2453 Apr 24 00:30:04 user nova-compute[71205]: Virtual Machine Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: hvm Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Nehalem Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: /dev/urandom Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: Apr 24 00:30:04 user nova-compute[71205]: {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:30:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1216480377',display_name='tempest-TestMinimumBasicScenario-server-1216480377',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1216480377',id=24,image_ref='9ecdd39b-39aa-4c6d-a449-0f574c554082',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFLDJJcBY83bYII2cz2UBFOs5JtscigjyJMQ6jSr5Zo8Gh64wLUW7PYGf39MocYkX0cAAMut0zrK8qkD0z8MC1LDRzLMpevf5gaOOpR4zhAODKbduqMVR4FvaWE7tF8Jag==',key_name='tempest-TestMinimumBasicScenario-1846912878',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c607157092ca4b0ea2d01c6fe7fb18c2',ramdisk_id='',reservation_id='r-537bb9n2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9ecdd39b-39aa-4c6d-a449-0f574c554082',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-776289633',owner_user_name='tempest-TestMinimumBasicScenario-776289633-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:30:02Z,user_data=None,user_id='e2ca48cbf8204d958685eaa42f6b6952',uuid=3441ea8c-1b7f-4db7-b26a-37c56edc2453,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cd924951-296b-43e1-b7d6-c7769ba90ee7", "address": "fa:16:3e:c1:ad:e8", "network": {"id": "bbd390a3-499e-4b86-9c48-401b14864b06", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-337104155-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c607157092ca4b0ea2d01c6fe7fb18c2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd924951-29", "ovs_interfaceid": "cd924951-296b-43e1-b7d6-c7769ba90ee7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Converting VIF {"id": "cd924951-296b-43e1-b7d6-c7769ba90ee7", 
"address": "fa:16:3e:c1:ad:e8", "network": {"id": "bbd390a3-499e-4b86-9c48-401b14864b06", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-337104155-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c607157092ca4b0ea2d01c6fe7fb18c2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd924951-29", "ovs_interfaceid": "cd924951-296b-43e1-b7d6-c7769ba90ee7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:ad:e8,bridge_name='br-int',has_traffic_filtering=True,id=cd924951-296b-43e1-b7d6-c7769ba90ee7,network=Network(bbd390a3-499e-4b86-9c48-401b14864b06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd924951-29') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG os_vif [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:ad:e8,bridge_name='br-int',has_traffic_filtering=True,id=cd924951-296b-43e1-b7d6-c7769ba90ee7,network=Network(bbd390a3-499e-4b86-9c48-401b14864b06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd924951-29') {{(pid=71205) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcd924951-29, may_exist=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcd924951-29, col_values=(('external_ids', {'iface-id': 'cd924951-296b-43e1-b7d6-c7769ba90ee7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c1:ad:e8', 'vm-uuid': '3441ea8c-1b7f-4db7-b26a-37c56edc2453'}),)) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:04 user nova-compute[71205]: INFO os_vif [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:ad:e8,bridge_name='br-int',has_traffic_filtering=True,id=cd924951-296b-43e1-b7d6-c7769ba90ee7,network=Network(bbd390a3-499e-4b86-9c48-401b14864b06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd924951-29') Apr 24 00:30:04 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] No BDM found with device name vda, not building metadata. {{(pid=71205) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 24 00:30:04 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] No VIF found with MAC fa:16:3e:c1:ad:e8, not building metadata {{(pid=71205) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 24 00:30:05 user nova-compute[71205]: DEBUG nova.network.neutron [req-e3bd7387-c8bd-4f5c-ac0a-3d131eb7fa5f req-cafc8e91-16d0-4a26-b416-fa6d6215f28d service nova] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Updated VIF entry in instance network info cache for port cd924951-296b-43e1-b7d6-c7769ba90ee7. 
{{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:30:05 user nova-compute[71205]: DEBUG nova.network.neutron [req-e3bd7387-c8bd-4f5c-ac0a-3d131eb7fa5f req-cafc8e91-16d0-4a26-b416-fa6d6215f28d service nova] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Updating instance_info_cache with network_info: [{"id": "cd924951-296b-43e1-b7d6-c7769ba90ee7", "address": "fa:16:3e:c1:ad:e8", "network": {"id": "bbd390a3-499e-4b86-9c48-401b14864b06", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-337104155-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c607157092ca4b0ea2d01c6fe7fb18c2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd924951-29", "ovs_interfaceid": "cd924951-296b-43e1-b7d6-c7769ba90ee7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:30:05 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-e3bd7387-c8bd-4f5c-ac0a-3d131eb7fa5f req-cafc8e91-16d0-4a26-b416-fa6d6215f28d service nova] Releasing lock "refresh_cache-3441ea8c-1b7f-4db7-b26a-37c56edc2453" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:30:05 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:05 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:05 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:05 user nova-compute[71205]: DEBUG nova.compute.manager [req-3624562e-6bdf-4eca-b320-abbb232f5565 req-97bfd309-65a0-470c-8abb-3a7bcb8d63e2 service nova] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Received event network-vif-plugged-cd924951-296b-43e1-b7d6-c7769ba90ee7 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:30:05 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-3624562e-6bdf-4eca-b320-abbb232f5565 req-97bfd309-65a0-470c-8abb-3a7bcb8d63e2 service nova] Acquiring lock "3441ea8c-1b7f-4db7-b26a-37c56edc2453-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:30:05 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-3624562e-6bdf-4eca-b320-abbb232f5565 req-97bfd309-65a0-470c-8abb-3a7bcb8d63e2 service nova] Lock "3441ea8c-1b7f-4db7-b26a-37c56edc2453-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:30:05 user 
nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-3624562e-6bdf-4eca-b320-abbb232f5565 req-97bfd309-65a0-470c-8abb-3a7bcb8d63e2 service nova] Lock "3441ea8c-1b7f-4db7-b26a-37c56edc2453-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:30:05 user nova-compute[71205]: DEBUG nova.compute.manager [req-3624562e-6bdf-4eca-b320-abbb232f5565 req-97bfd309-65a0-470c-8abb-3a7bcb8d63e2 service nova] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] No waiting events found dispatching network-vif-plugged-cd924951-296b-43e1-b7d6-c7769ba90ee7 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:30:05 user nova-compute[71205]: WARNING nova.compute.manager [req-3624562e-6bdf-4eca-b320-abbb232f5565 req-97bfd309-65a0-470c-8abb-3a7bcb8d63e2 service nova] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Received unexpected event network-vif-plugged-cd924951-296b-43e1-b7d6-c7769ba90ee7 for instance with vm_state building and task_state spawning. Apr 24 00:30:05 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:05 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:06 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:07 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Resumed> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:30:07 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] VM Resumed (Lifecycle Event) Apr 24 00:30:07 user nova-compute[71205]: DEBUG nova.compute.manager [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Instance event wait completed in 0 seconds for {{(pid=71205) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 24 00:30:07 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Guest created on hypervisor {{(pid=71205) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 24 00:30:07 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Instance spawned successfully. 
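The "Acquiring lock ... acquired ... :: waited ... released ... :: held" triplets around _pop_event above are the standard trace produced by oslo.concurrency's lock decorator. A minimal sketch of that pattern follows; the lock name is taken from the log, but the wrapped function is an illustrative stand-in, not Nova's actual _pop_event.

    from oslo_concurrency import lockutils

    # When debug logging is enabled, the decorator emits "Acquiring lock ...",
    # "Lock ... acquired ... :: waited", and "... released ... :: held" messages
    # like the ones above while the named in-process lock is held.
    @lockutils.synchronized('3441ea8c-1b7f-4db7-b26a-37c56edc2453-events')
    def _pop_event():
        pass  # work done under the per-instance event lock

    _pop_event()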
Apr 24 00:30:07 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 24 00:30:07 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:30:07 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Found default for hw_cdrom_bus of ide {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:30:07 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Found default for hw_disk_bus of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:30:07 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Found default for hw_input_bus of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:30:07 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Found default for hw_pointer_model of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:30:07 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Found default for hw_video_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:30:07 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Found default for hw_vif_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:30:07 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: 
building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:30:07 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:30:07 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Started> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:30:07 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] VM Started (Lifecycle Event) Apr 24 00:30:07 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:30:07 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:30:07 user nova-compute[71205]: INFO nova.compute.manager [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Took 5.41 seconds to spawn the instance on the hypervisor. Apr 24 00:30:07 user nova-compute[71205]: DEBUG nova.compute.manager [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:30:07 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:30:07 user nova-compute[71205]: INFO nova.compute.manager [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Took 5.94 seconds to build instance. 
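The two "Synchronizing instance power state ... DB power_state: 0, VM power_state: 1" entries followed by "pending task (spawning). Skip." suggest a guard that defers reconciliation while a task is in flight. A rough, hypothetical sketch of that decision (names and return values are assumptions, not Nova's implementation; 0 and 1 correspond to the NOSTATE and RUNNING power states):

    NOSTATE, RUNNING = 0, 1

    def sync_power_state(db_power_state, vm_power_state, task_state):
        # While a task such as 'spawning' is pending, the lifecycle-event sync
        # skips reconciliation, matching the "pending task (spawning). Skip." lines.
        if task_state is not None:
            return 'skip'
        if db_power_state != vm_power_state:
            return 'update-db'
        return 'in-sync'

    print(sync_power_state(NOSTATE, RUNNING, 'spawning'))  # -> skip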
Apr 24 00:30:07 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-3ee209ee-5f76-49db-b046-e317978963b2 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "3441ea8c-1b7f-4db7-b26a-37c56edc2453" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.035s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:30:08 user nova-compute[71205]: DEBUG nova.compute.manager [req-d124a9cf-a0a7-4b57-8673-a6c0a881b5ad req-5724ebf9-3590-4c98-b892-c0feff7cb2c9 service nova] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Received event network-vif-plugged-cd924951-296b-43e1-b7d6-c7769ba90ee7 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:30:08 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-d124a9cf-a0a7-4b57-8673-a6c0a881b5ad req-5724ebf9-3590-4c98-b892-c0feff7cb2c9 service nova] Acquiring lock "3441ea8c-1b7f-4db7-b26a-37c56edc2453-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:30:08 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-d124a9cf-a0a7-4b57-8673-a6c0a881b5ad req-5724ebf9-3590-4c98-b892-c0feff7cb2c9 service nova] Lock "3441ea8c-1b7f-4db7-b26a-37c56edc2453-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:30:08 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-d124a9cf-a0a7-4b57-8673-a6c0a881b5ad req-5724ebf9-3590-4c98-b892-c0feff7cb2c9 service nova] Lock "3441ea8c-1b7f-4db7-b26a-37c56edc2453-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:30:08 user nova-compute[71205]: DEBUG nova.compute.manager [req-d124a9cf-a0a7-4b57-8673-a6c0a881b5ad req-5724ebf9-3590-4c98-b892-c0feff7cb2c9 service nova] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] No waiting events found dispatching network-vif-plugged-cd924951-296b-43e1-b7d6-c7769ba90ee7 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:30:08 user nova-compute[71205]: WARNING nova.compute.manager [req-d124a9cf-a0a7-4b57-8673-a6c0a881b5ad req-5724ebf9-3590-4c98-b892-c0feff7cb2c9 service nova] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Received unexpected event network-vif-plugged-cd924951-296b-43e1-b7d6-c7769ba90ee7 for instance with vm_state active and task_state None. 
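The repeated "No waiting events found dispatching network-vif-plugged-..." and "Received unexpected event ..." entries reflect a wait/pop registry: a spawning thread registers interest in an event, and an incoming external event either wakes that waiter or is flagged as unexpected. The sketch below is a simplified illustration of that idea using plain threading; the class and method names are hypothetical and not Nova's API.

    import threading

    class EventRegistry:
        def __init__(self):
            self._waiters = {}   # event name -> threading.Event

        def prepare(self, name):
            # Caller registers interest before triggering the external action.
            ev = threading.Event()
            self._waiters[name] = ev
            return ev

        def pop(self, name):
            # An incoming event either wakes a registered waiter or is "unexpected".
            ev = self._waiters.pop(name, None)
            if ev is None:
                print('WARNING: received unexpected event %s' % name)
                return
            ev.set()

    reg = EventRegistry()
    waiter = reg.prepare('network-vif-plugged-cd924951')
    reg.pop('network-vif-plugged-cd924951')   # wakes the waiter
    reg.pop('network-vif-plugged-cd924951')   # no waiter left -> warning, as in the log
    assert waiter.is_set()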
Apr 24 00:30:08 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:09 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:14 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:23 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:26 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:30:28 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:30:28 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:30:28 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:29 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:30:29 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:30:29 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:30:29 user nova-compute[71205]: DEBUG 
oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:30:29 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71205) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 24 00:30:31 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:30:31 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:30:31 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:30:31 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:30:31 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:30:31 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3441ea8c-1b7f-4db7-b26a-37c56edc2453/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:30:31 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3441ea8c-1b7f-4db7-b26a-37c56edc2453/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:30:31 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3441ea8c-1b7f-4db7-b26a-37c56edc2453/disk --force-share --output=json {{(pid=71205) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:30:31 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3441ea8c-1b7f-4db7-b26a-37c56edc2453/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:30:32 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:30:32 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:30:32 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Hypervisor/Node resource view: name=user free_ram=9098MB free_disk=26.57260513305664GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:30:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:30:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:30:32 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance 3441ea8c-1b7f-4db7-b26a-37c56edc2453 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:30:32 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:30:32 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:30:32 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:30:32 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:30:32 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:30:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.208s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:30:33 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:30:33 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Starting heal instance info cache {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 24 00:30:33 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Rebuilding the list of instances to heal {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 24 00:30:33 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "refresh_cache-3441ea8c-1b7f-4db7-b26a-37c56edc2453" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:30:33 user nova-compute[71205]: DEBUG 
oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquired lock "refresh_cache-3441ea8c-1b7f-4db7-b26a-37c56edc2453" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:30:33 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Forcefully refreshing network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 24 00:30:33 user nova-compute[71205]: DEBUG nova.objects.instance [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lazy-loading 'info_cache' on Instance uuid 3441ea8c-1b7f-4db7-b26a-37c56edc2453 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:30:34 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Updating instance_info_cache with network_info: [{"id": "cd924951-296b-43e1-b7d6-c7769ba90ee7", "address": "fa:16:3e:c1:ad:e8", "network": {"id": "bbd390a3-499e-4b86-9c48-401b14864b06", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-337104155-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c607157092ca4b0ea2d01c6fe7fb18c2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd924951-29", "ovs_interfaceid": "cd924951-296b-43e1-b7d6-c7769ba90ee7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:30:34 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Releasing lock "refresh_cache-3441ea8c-1b7f-4db7-b26a-37c56edc2453" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:30:34 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Updated the network info_cache for instance {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 24 00:30:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:30:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:30:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition 
/usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:30:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:30:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:39 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:44 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:30:44 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:44 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:30:44 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:30:44 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:30:44 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:49 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:30:59 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:31:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:31:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:31:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:31:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition 
/usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:31:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:31:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:31:09 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:31:14 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:31:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:31:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:31:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:31:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:31:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:31:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:31:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:31:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:31:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:31:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:31:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:31:24 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:31:28 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:31:28 user nova-compute[71205]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:31:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:31:29 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:31:30 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:31:30 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:31:30 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:31:30 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:31:30 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71205) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 24 00:31:31 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:31:32 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:31:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:31:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:31:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:31:32 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:31:32 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3441ea8c-1b7f-4db7-b26a-37c56edc2453/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:31:32 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3441ea8c-1b7f-4db7-b26a-37c56edc2453/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:31:32 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3441ea8c-1b7f-4db7-b26a-37c56edc2453/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:31:32 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None 
req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3441ea8c-1b7f-4db7-b26a-37c56edc2453/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:31:33 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:31:33 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:31:33 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Hypervisor/Node resource view: name=user free_ram=9141MB free_disk=26.571895599365234GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, 
{"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": 
"15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:31:33 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:31:33 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:31:33 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance 3441ea8c-1b7f-4db7-b26a-37c56edc2453 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:31:33 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:31:33 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:31:33 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:31:33 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:31:33 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:31:33 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.193s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:31:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:31:34 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:31:34 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:31:34 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Starting heal instance info cache {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 24 00:31:34 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] 
Rebuilding the list of instances to heal {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 24 00:31:34 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "refresh_cache-3441ea8c-1b7f-4db7-b26a-37c56edc2453" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:31:34 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquired lock "refresh_cache-3441ea8c-1b7f-4db7-b26a-37c56edc2453" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:31:34 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Forcefully refreshing network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 24 00:31:34 user nova-compute[71205]: DEBUG nova.objects.instance [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lazy-loading 'info_cache' on Instance uuid 3441ea8c-1b7f-4db7-b26a-37c56edc2453 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:31:34 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Updating instance_info_cache with network_info: [{"id": "cd924951-296b-43e1-b7d6-c7769ba90ee7", "address": "fa:16:3e:c1:ad:e8", "network": {"id": "bbd390a3-499e-4b86-9c48-401b14864b06", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-337104155-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c607157092ca4b0ea2d01c6fe7fb18c2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd924951-29", "ovs_interfaceid": "cd924951-296b-43e1-b7d6-c7769ba90ee7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:31:34 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Releasing lock "refresh_cache-3441ea8c-1b7f-4db7-b26a-37c56edc2453" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:31:34 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Updated the network info_cache for instance {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 24 00:31:39 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:31:44 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:31:44 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:31:44 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:31:44 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:31:44 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:31:44 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:31:49 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:31:49 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:31:49 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:31:49 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:31:49 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:31:49 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:31:54 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9ee08598-c5dc-4728-b10b-bb5f52187677 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Acquiring lock "3441ea8c-1b7f-4db7-b26a-37c56edc2453" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:31:54 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9ee08598-c5dc-4728-b10b-bb5f52187677 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "3441ea8c-1b7f-4db7-b26a-37c56edc2453" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:31:54 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9ee08598-c5dc-4728-b10b-bb5f52187677 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Acquiring lock "3441ea8c-1b7f-4db7-b26a-37c56edc2453-events" by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:31:54 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9ee08598-c5dc-4728-b10b-bb5f52187677 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "3441ea8c-1b7f-4db7-b26a-37c56edc2453-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:31:54 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9ee08598-c5dc-4728-b10b-bb5f52187677 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "3441ea8c-1b7f-4db7-b26a-37c56edc2453-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:31:54 user nova-compute[71205]: INFO nova.compute.manager [None req-9ee08598-c5dc-4728-b10b-bb5f52187677 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Terminating instance Apr 24 00:31:54 user nova-compute[71205]: DEBUG nova.compute.manager [None req-9ee08598-c5dc-4728-b10b-bb5f52187677 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Start destroying the instance on the hypervisor. {{(pid=71205) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 24 00:31:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:31:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:31:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:31:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:31:54 user nova-compute[71205]: DEBUG nova.compute.manager [req-6eba3bd7-6297-4c52-b8c0-432ef46222a7 req-100c2ee2-3614-4640-8e57-cb30d41a840e service nova] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Received event network-vif-unplugged-cd924951-296b-43e1-b7d6-c7769ba90ee7 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:31:54 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-6eba3bd7-6297-4c52-b8c0-432ef46222a7 req-100c2ee2-3614-4640-8e57-cb30d41a840e service nova] Acquiring lock "3441ea8c-1b7f-4db7-b26a-37c56edc2453-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:31:54 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-6eba3bd7-6297-4c52-b8c0-432ef46222a7 req-100c2ee2-3614-4640-8e57-cb30d41a840e service nova] Lock 
"3441ea8c-1b7f-4db7-b26a-37c56edc2453-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:31:54 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-6eba3bd7-6297-4c52-b8c0-432ef46222a7 req-100c2ee2-3614-4640-8e57-cb30d41a840e service nova] Lock "3441ea8c-1b7f-4db7-b26a-37c56edc2453-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:31:54 user nova-compute[71205]: DEBUG nova.compute.manager [req-6eba3bd7-6297-4c52-b8c0-432ef46222a7 req-100c2ee2-3614-4640-8e57-cb30d41a840e service nova] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] No waiting events found dispatching network-vif-unplugged-cd924951-296b-43e1-b7d6-c7769ba90ee7 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:31:54 user nova-compute[71205]: DEBUG nova.compute.manager [req-6eba3bd7-6297-4c52-b8c0-432ef46222a7 req-100c2ee2-3614-4640-8e57-cb30d41a840e service nova] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Received event network-vif-unplugged-cd924951-296b-43e1-b7d6-c7769ba90ee7 for instance with task_state deleting. {{(pid=71205) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 24 00:31:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:31:54 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Instance destroyed successfully. 
Apr 24 00:31:54 user nova-compute[71205]: DEBUG nova.objects.instance [None req-9ee08598-c5dc-4728-b10b-bb5f52187677 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lazy-loading 'resources' on Instance uuid 3441ea8c-1b7f-4db7-b26a-37c56edc2453 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:31:54 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-9ee08598-c5dc-4728-b10b-bb5f52187677 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:30:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1216480377',display_name='tempest-TestMinimumBasicScenario-server-1216480377',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1216480377',id=24,image_ref='9ecdd39b-39aa-4c6d-a449-0f574c554082',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFLDJJcBY83bYII2cz2UBFOs5JtscigjyJMQ6jSr5Zo8Gh64wLUW7PYGf39MocYkX0cAAMut0zrK8qkD0z8MC1LDRzLMpevf5gaOOpR4zhAODKbduqMVR4FvaWE7tF8Jag==',key_name='tempest-TestMinimumBasicScenario-1846912878',keypairs=,launch_index=0,launched_at=2023-04-24T00:30:07Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='c607157092ca4b0ea2d01c6fe7fb18c2',ramdisk_id='',reservation_id='r-537bb9n2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9ecdd39b-39aa-4c6d-a449-0f574c554082',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-776289633',owner_user_name='tempest-TestMinimumBasicScenario-776289633-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-24T00:30:08Z,user_data=None,user_id='e2ca48cbf8204d958685eaa42f6b6952',uuid=3441ea8c-1b7f-4db7-b26a-37c56edc2453,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cd924951-296b-43e1-b7d6-c7769ba90ee7", "address": "fa:16:3e:c1:ad:e8", "network": {"id": "bbd390a3-499e-4b86-9c48-401b14864b06", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-337104155-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c607157092ca4b0ea2d01c6fe7fb18c2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", 
"details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd924951-29", "ovs_interfaceid": "cd924951-296b-43e1-b7d6-c7769ba90ee7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 24 00:31:54 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-9ee08598-c5dc-4728-b10b-bb5f52187677 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Converting VIF {"id": "cd924951-296b-43e1-b7d6-c7769ba90ee7", "address": "fa:16:3e:c1:ad:e8", "network": {"id": "bbd390a3-499e-4b86-9c48-401b14864b06", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-337104155-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c607157092ca4b0ea2d01c6fe7fb18c2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd924951-29", "ovs_interfaceid": "cd924951-296b-43e1-b7d6-c7769ba90ee7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:31:54 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-9ee08598-c5dc-4728-b10b-bb5f52187677 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c1:ad:e8,bridge_name='br-int',has_traffic_filtering=True,id=cd924951-296b-43e1-b7d6-c7769ba90ee7,network=Network(bbd390a3-499e-4b86-9c48-401b14864b06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd924951-29') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:31:54 user nova-compute[71205]: DEBUG os_vif [None req-9ee08598-c5dc-4728-b10b-bb5f52187677 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c1:ad:e8,bridge_name='br-int',has_traffic_filtering=True,id=cd924951-296b-43e1-b7d6-c7769ba90ee7,network=Network(bbd390a3-499e-4b86-9c48-401b14864b06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd924951-29') {{(pid=71205) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 24 00:31:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:31:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd924951-29, bridge=br-int, if_exists=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:31:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 
{{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:31:54 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:31:54 user nova-compute[71205]: INFO os_vif [None req-9ee08598-c5dc-4728-b10b-bb5f52187677 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c1:ad:e8,bridge_name='br-int',has_traffic_filtering=True,id=cd924951-296b-43e1-b7d6-c7769ba90ee7,network=Network(bbd390a3-499e-4b86-9c48-401b14864b06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd924951-29') Apr 24 00:31:54 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-9ee08598-c5dc-4728-b10b-bb5f52187677 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Deleting instance files /opt/stack/data/nova/instances/3441ea8c-1b7f-4db7-b26a-37c56edc2453_del Apr 24 00:31:54 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-9ee08598-c5dc-4728-b10b-bb5f52187677 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Deletion of /opt/stack/data/nova/instances/3441ea8c-1b7f-4db7-b26a-37c56edc2453_del complete Apr 24 00:31:54 user nova-compute[71205]: INFO nova.compute.manager [None req-9ee08598-c5dc-4728-b10b-bb5f52187677 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Took 0.85 seconds to destroy the instance on the hypervisor. Apr 24 00:31:54 user nova-compute[71205]: DEBUG oslo.service.loopingcall [None req-9ee08598-c5dc-4728-b10b-bb5f52187677 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71205) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 24 00:31:54 user nova-compute[71205]: DEBUG nova.compute.manager [-] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Deallocating network for instance {{(pid=71205) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 24 00:31:54 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] deallocate_for_instance() {{(pid=71205) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 24 00:31:55 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:31:55 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Took 0.63 seconds to deallocate network for instance. 
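The unplug sequence above (Converting VIF, Converted object VIFOpenVSwitch, Unplugging vif, DelPortCommand against tcp:127.0.0.1:6640, Successfully unplugged) is Nova handing the port to os-vif, whose ovs plugin removes tapcd924951-29 from br-int via ovsdbapp. A minimal sketch of driving the same public os-vif API directly, with field values copied from the VIFOpenVSwitch object logged above; this is illustrative only, not Nova's actual call path:

```python
# Minimal sketch, assuming the public os-vif API; values mirror the
# VIFOpenVSwitch object in the log above and are not Nova's code.
import os_vif
from os_vif.objects import instance_info, network, vif

os_vif.initialize()  # loads the linux_bridge/noop/ovs plugins seen at startup

inst = instance_info.InstanceInfo(
    uuid='3441ea8c-1b7f-4db7-b26a-37c56edc2453',
    name='tempest-TestMinimumBasicScenario-server-1216480377')

port = vif.VIFOpenVSwitch(
    id='cd924951-296b-43e1-b7d6-c7769ba90ee7',
    address='fa:16:3e:c1:ad:e8',
    vif_name='tapcd924951-29',
    bridge_name='br-int',
    port_profile=vif.VIFPortProfileOpenVSwitch(
        interface_id='cd924951-296b-43e1-b7d6-c7769ba90ee7'),
    network=network.Network(id='bbd390a3-499e-4b86-9c48-401b14864b06'))

# The ovs plugin issues a DelPortCommand for tapcd924951-29 on br-int,
# matching the ovsdbapp transaction logged above.
os_vif.unplug(port, inst)
```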
Apr 24 00:31:55 user nova-compute[71205]: DEBUG nova.compute.manager [req-3dc684c5-fa0c-4907-b2b2-aff8be2bffe9 req-d60b185c-82c2-4b6e-be49-7270865b6e82 service nova] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Received event network-vif-deleted-cd924951-296b-43e1-b7d6-c7769ba90ee7 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:31:55 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9ee08598-c5dc-4728-b10b-bb5f52187677 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:31:55 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9ee08598-c5dc-4728-b10b-bb5f52187677 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:31:55 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-9ee08598-c5dc-4728-b10b-bb5f52187677 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:31:55 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-9ee08598-c5dc-4728-b10b-bb5f52187677 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:31:55 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9ee08598-c5dc-4728-b10b-bb5f52187677 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.115s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:31:55 user nova-compute[71205]: INFO nova.scheduler.client.report [None req-9ee08598-c5dc-4728-b10b-bb5f52187677 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Deleted allocations for instance 3441ea8c-1b7f-4db7-b26a-37c56edc2453 Apr 24 00:31:55 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9ee08598-c5dc-4728-b10b-bb5f52187677 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "3441ea8c-1b7f-4db7-b26a-37c56edc2453" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.772s {{(pid=71205) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:31:56 user nova-compute[71205]: DEBUG nova.compute.manager [req-a13cbadf-17d7-4c70-80de-832eb5fb0d16 req-bd94e17d-4a0d-4a1b-89f5-0b0761588e1f service nova] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Received event network-vif-plugged-cd924951-296b-43e1-b7d6-c7769ba90ee7 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:31:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-a13cbadf-17d7-4c70-80de-832eb5fb0d16 req-bd94e17d-4a0d-4a1b-89f5-0b0761588e1f service nova] Acquiring lock "3441ea8c-1b7f-4db7-b26a-37c56edc2453-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:31:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-a13cbadf-17d7-4c70-80de-832eb5fb0d16 req-bd94e17d-4a0d-4a1b-89f5-0b0761588e1f service nova] Lock "3441ea8c-1b7f-4db7-b26a-37c56edc2453-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:31:56 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-a13cbadf-17d7-4c70-80de-832eb5fb0d16 req-bd94e17d-4a0d-4a1b-89f5-0b0761588e1f service nova] Lock "3441ea8c-1b7f-4db7-b26a-37c56edc2453-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:31:56 user nova-compute[71205]: DEBUG nova.compute.manager [req-a13cbadf-17d7-4c70-80de-832eb5fb0d16 req-bd94e17d-4a0d-4a1b-89f5-0b0761588e1f service nova] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] No waiting events found dispatching network-vif-plugged-cd924951-296b-43e1-b7d6-c7769ba90ee7 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:31:56 user nova-compute[71205]: WARNING nova.compute.manager [req-a13cbadf-17d7-4c70-80de-832eb5fb0d16 req-bd94e17d-4a0d-4a1b-89f5-0b0761588e1f service nova] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Received unexpected event network-vif-plugged-cd924951-296b-43e1-b7d6-c7769ba90ee7 for instance with vm_state deleted and task_state None. 
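Each time the resource tracker reports that inventory "has not changed" for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4, the accompanying dict is what placement uses to size the host: schedulable capacity per resource class is (total - reserved) * allocation_ratio. A short sketch working that arithmetic through for the values in this log:

```python
# Capacity as placement computes it: (total - reserved) * allocation_ratio,
# using the inventory data reported repeatedly in this log.
inventory = {
    'VCPU':      {'total': 12,    'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 40,    'reserved': 0,   'allocation_ratio': 1.0},
}

for rc, inv in inventory.items():
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(f"{rc}: {capacity:g}")
# VCPU: 48, MEMORY_MB: 15511, DISK_GB: 40 -- which is why a single
# 1-VCPU/128MB/1GB instance barely dents the "Final resource view" above.
```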
Apr 24 00:31:59 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:32:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:32:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:32:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:32:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:32:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:32:04 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:32:09 user nova-compute[71205]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:32:09 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] VM Stopped (Lifecycle Event) Apr 24 00:32:09 user nova-compute[71205]: DEBUG nova.compute.manager [None req-d04187c9-c93e-456b-95b5-4037507f6529 None None] [instance: 3441ea8c-1b7f-4db7-b26a-37c56edc2453] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:32:09 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:32:14 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:32:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:32:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:32:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:32:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:32:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:32:19 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:32:24 user 
nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:32:29 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:32:29 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:32:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:32:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:32:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:32:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:32:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:32:29 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:32:30 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:32:30 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:32:30 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:32:30 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71205) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 24 00:32:30 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:32:30 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Cleaning up deleted instances {{(pid=71205) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 24 00:32:30 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] There are 0 instances to clean {{(pid=71205) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 24 00:32:31 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:32:31 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:32:31 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Cleaning up deleted instances with incomplete migration {{(pid=71205) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 24 00:32:32 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:32:32 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:32:32 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:32:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:32:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:32:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:32:32 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:32:32 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:32:32 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:32:32 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Hypervisor/Node resource view: name=user free_ram=9232MB free_disk=26.590904235839844GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", 
"vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, 
{"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:32:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:32:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:32:32 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:32:32 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:32:32 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:32:32 user nova-compute[71205]: DEBUG 
nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:32:32 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:32:32 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.156s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:32:34 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:32:35 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:32:35 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Starting heal instance info cache {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 24 00:32:35 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Rebuilding the list of instances to heal {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 24 00:32:35 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Didn't find any instances for network info cache update. 
{{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 24 00:32:39 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:32:44 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:32:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Acquiring lock "b330e0bf-027f-4d61-ba28-93465a74b370" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:32:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "b330e0bf-027f-4d61-ba28-93465a74b370" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:32:48 user nova-compute[71205]: DEBUG nova.compute.manager [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Starting instance... {{(pid=71205) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 24 00:32:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:32:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:32:48 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71205) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 24 00:32:48 user nova-compute[71205]: INFO nova.compute.claims [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Claim successful on node user Apr 24 00:32:48 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:32:48 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:32:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.180s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:32:48 user nova-compute[71205]: DEBUG nova.compute.manager [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Start building networks asynchronously for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 24 00:32:48 user nova-compute[71205]: DEBUG nova.compute.manager [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Allocating IP information in the background. {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 24 00:32:48 user nova-compute[71205]: DEBUG nova.network.neutron [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] allocate_for_instance() {{(pid=71205) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 24 00:32:48 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 24 00:32:48 user nova-compute[71205]: DEBUG nova.compute.manager [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Start building block device mappings for instance. {{(pid=71205) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 24 00:32:48 user nova-compute[71205]: DEBUG nova.policy [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e2ca48cbf8204d958685eaa42f6b6952', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c607157092ca4b0ea2d01c6fe7fb18c2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71205) authorize /opt/stack/nova/nova/policy.py:203}} Apr 24 00:32:48 user nova-compute[71205]: DEBUG nova.compute.manager [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Start spawning the instance on the hypervisor. {{(pid=71205) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 24 00:32:48 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Creating instance directory {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 24 00:32:48 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Creating image(s) Apr 24 00:32:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Acquiring lock "/opt/stack/data/nova/instances/b330e0bf-027f-4d61-ba28-93465a74b370/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:32:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "/opt/stack/data/nova/instances/b330e0bf-027f-4d61-ba28-93465a74b370/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:32:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock 
"/opt/stack/data/nova/instances/b330e0bf-027f-4d61-ba28-93465a74b370/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:32:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Acquiring lock "3c0c63c2a562e999dfae9e97027ffc57ec4dd413" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:32:48 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "3c0c63c2a562e999dfae9e97027ffc57ec4dd413" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:32:48 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/3c0c63c2a562e999dfae9e97027ffc57ec4dd413.part --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/3c0c63c2a562e999dfae9e97027ffc57ec4dd413.part --force-share --output=json" returned: 0 in 0.136s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG nova.virt.images [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] fa4c3529-43a6-4316-88b2-b3bbed76c1f7 was qcow2, converting to raw {{(pid=71205) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG nova.privsep.utils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71205) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/3c0c63c2a562e999dfae9e97027ffc57ec4dd413.part /opt/stack/data/nova/instances/_base/3c0c63c2a562e999dfae9e97027ffc57ec4dd413.converted {{(pid=71205) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/3c0c63c2a562e999dfae9e97027ffc57ec4dd413.part /opt/stack/data/nova/instances/_base/3c0c63c2a562e999dfae9e97027ffc57ec4dd413.converted" returned: 0 in 0.154s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/3c0c63c2a562e999dfae9e97027ffc57ec4dd413.converted --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG nova.network.neutron [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Successfully created port: c8d80bab-94ba-4cad-aa6e-ff776b596163 {{(pid=71205) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/3c0c63c2a562e999dfae9e97027ffc57ec4dd413.converted --force-share --output=json" returned: 0 in 0.139s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "3c0c63c2a562e999dfae9e97027ffc57ec4dd413" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.729s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/3c0c63c2a562e999dfae9e97027ffc57ec4dd413 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 
-- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/3c0c63c2a562e999dfae9e97027ffc57ec4dd413 --force-share --output=json" returned: 0 in 0.128s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Acquiring lock "3c0c63c2a562e999dfae9e97027ffc57ec4dd413" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "3c0c63c2a562e999dfae9e97027ffc57ec4dd413" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/3c0c63c2a562e999dfae9e97027ffc57ec4dd413 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/3c0c63c2a562e999dfae9e97027ffc57ec4dd413 --force-share --output=json" returned: 0 in 0.127s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/3c0c63c2a562e999dfae9e97027ffc57ec4dd413,backing_fmt=raw /opt/stack/data/nova/instances/b330e0bf-027f-4d61-ba28-93465a74b370/disk 1073741824 {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/3c0c63c2a562e999dfae9e97027ffc57ec4dd413,backing_fmt=raw /opt/stack/data/nova/instances/b330e0bf-027f-4d61-ba28-93465a74b370/disk 1073741824" returned: 0 in 0.048s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:32:49 user 
nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "3c0c63c2a562e999dfae9e97027ffc57ec4dd413" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.182s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/3c0c63c2a562e999dfae9e97027ffc57ec4dd413 --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/3c0c63c2a562e999dfae9e97027ffc57ec4dd413 --force-share --output=json" returned: 0 in 0.129s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Checking if we can resize image /opt/stack/data/nova/instances/b330e0bf-027f-4d61-ba28-93465a74b370/disk. 
size=1073741824 {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b330e0bf-027f-4d61-ba28-93465a74b370/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b330e0bf-027f-4d61-ba28-93465a74b370/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG nova.virt.disk.api [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Cannot resize image /opt/stack/data/nova/instances/b330e0bf-027f-4d61-ba28-93465a74b370/disk to a smaller size. {{(pid=71205) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG nova.objects.instance [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lazy-loading 'migration_context' on Instance uuid b330e0bf-027f-4d61-ba28-93465a74b370 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Created local disks {{(pid=71205) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Ensure instance console log exists: /opt/stack/data/nova/instances/b330e0bf-027f-4d61-ba28-93465a74b370/console.log {{(pid=71205) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None 
req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:32:49 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.network.neutron [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Successfully updated port: c8d80bab-94ba-4cad-aa6e-ff776b596163 {{(pid=71205) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Acquiring lock "refresh_cache-b330e0bf-027f-4d61-ba28-93465a74b370" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Acquired lock "refresh_cache-b330e0bf-027f-4d61-ba28-93465a74b370" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.network.neutron [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Building network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.compute.manager [req-4c72b616-234c-4600-83db-e2cec3fb907e req-621709ea-3710-4819-a095-252c357d8bc6 service nova] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Received event network-changed-c8d80bab-94ba-4cad-aa6e-ff776b596163 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.compute.manager [req-4c72b616-234c-4600-83db-e2cec3fb907e req-621709ea-3710-4819-a095-252c357d8bc6 service nova] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Refreshing instance network info cache due to event network-changed-c8d80bab-94ba-4cad-aa6e-ff776b596163. 
{{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-4c72b616-234c-4600-83db-e2cec3fb907e req-621709ea-3710-4819-a095-252c357d8bc6 service nova] Acquiring lock "refresh_cache-b330e0bf-027f-4d61-ba28-93465a74b370" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.network.neutron [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Instance cache missing network info. {{(pid=71205) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.network.neutron [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Updating instance_info_cache with network_info: [{"id": "c8d80bab-94ba-4cad-aa6e-ff776b596163", "address": "fa:16:3e:7d:dd:a7", "network": {"id": "bbd390a3-499e-4b86-9c48-401b14864b06", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-337104155-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c607157092ca4b0ea2d01c6fe7fb18c2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d80bab-94", "ovs_interfaceid": "c8d80bab-94ba-4cad-aa6e-ff776b596163", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Releasing lock "refresh_cache-b330e0bf-027f-4d61-ba28-93465a74b370" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.compute.manager [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Instance network_info: |[{"id": "c8d80bab-94ba-4cad-aa6e-ff776b596163", "address": "fa:16:3e:7d:dd:a7", "network": {"id": "bbd390a3-499e-4b86-9c48-401b14864b06", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-337104155-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c607157092ca4b0ea2d01c6fe7fb18c2", "mtu": 1442, "physical_network": null, "tunneled": true}}, 
"type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d80bab-94", "ovs_interfaceid": "c8d80bab-94ba-4cad-aa6e-ff776b596163", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71205) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-4c72b616-234c-4600-83db-e2cec3fb907e req-621709ea-3710-4819-a095-252c357d8bc6 service nova] Acquired lock "refresh_cache-b330e0bf-027f-4d61-ba28-93465a74b370" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.network.neutron [req-4c72b616-234c-4600-83db-e2cec3fb907e req-621709ea-3710-4819-a095-252c357d8bc6 service nova] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Refreshing network info cache for port c8d80bab-94ba-4cad-aa6e-ff776b596163 {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Start _get_guest_xml network_info=[{"id": "c8d80bab-94ba-4cad-aa6e-ff776b596163", "address": "fa:16:3e:7d:dd:a7", "network": {"id": "bbd390a3-499e-4b86-9c48-401b14864b06", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-337104155-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c607157092ca4b0ea2d01c6fe7fb18c2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d80bab-94", "ovs_interfaceid": "c8d80bab-94ba-4cad-aa6e-ff776b596163", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:32:45Z,direct_url=,disk_format='qcow2',id=fa4c3529-43a6-4316-88b2-b3bbed76c1f7,min_disk=0,min_ram=0,name='tempest-scenario-img--1184996871',owner='c607157092ca4b0ea2d01c6fe7fb18c2',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:32:47Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': 'fa4c3529-43a6-4316-88b2-b3bbed76c1f7'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71205) _get_guest_xml 
/opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 24 00:32:50 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:32:50 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71205) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-24T00:08:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-24T00:32:45Z,direct_url=,disk_format='qcow2',id=fa4c3529-43a6-4316-88b2-b3bbed76c1f7,min_disk=0,min_ram=0,name='tempest-scenario-img--1184996871',owner='c607157092ca4b0ea2d01c6fe7fb18c2',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-24T00:32:47Z,virtual_size=,visibility=), allow threads: True {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Flavor limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Image limits 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Flavor pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Image pref 0:0:0 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.virt.hardware 
[None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71205) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Got 1 possible topologies {{(pid=71205) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.virt.hardware [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71205) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:32:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1411545761',display_name='tempest-TestMinimumBasicScenario-server-1411545761',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1411545761',id=25,image_ref='fa4c3529-43a6-4316-88b2-b3bbed76c1f7',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA/yBQ3OeQ6p3JWTMHIDZTKaDI1rEfdHuupYSVs84i1LbT47YEJ6RDAi4BJ9tHinujF4UyeJuVsDtyFCcsUXegAo8xzzOvG1iF59HUjKwHaDr5EsL0PxFO1OUkiouNZxPA==',key_name='tempest-TestMinimumBasicScenario-571855735',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c607157092ca4b0ea2d01c6fe7fb18c2',ramdisk_id='',reservation_id='r-hwjbvx4w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fa4c3529-43a6-4316-88b2-b3bbed76c1f7',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-776289633',owner_user_name='tempest-TestMinimumBasicScenario-776289633-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:32:49Z,user_data=None,user_id='e2ca48cbf8204d958685eaa42f6b6952',uuid=b330e0bf-027f-4d61-ba28-93465a74b370,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c8d80bab-94ba-4cad-aa6e-ff776b596163", "address": "fa:16:3e:7d:dd:a7", "network": {"id": "bbd390a3-499e-4b86-9c48-401b14864b06", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-337104155-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c607157092ca4b0ea2d01c6fe7fb18c2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d80bab-94", "ovs_interfaceid": "c8d80bab-94ba-4cad-aa6e-ff776b596163", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71205) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Converting VIF {"id": "c8d80bab-94ba-4cad-aa6e-ff776b596163", "address": "fa:16:3e:7d:dd:a7", "network": {"id": "bbd390a3-499e-4b86-9c48-401b14864b06", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-337104155-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c607157092ca4b0ea2d01c6fe7fb18c2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d80bab-94", "ovs_interfaceid": 
"c8d80bab-94ba-4cad-aa6e-ff776b596163", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:dd:a7,bridge_name='br-int',has_traffic_filtering=True,id=c8d80bab-94ba-4cad-aa6e-ff776b596163,network=Network(bbd390a3-499e-4b86-9c48-401b14864b06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8d80bab-94') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.objects.instance [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lazy-loading 'pci_devices' on Instance uuid b330e0bf-027f-4d61-ba28-93465a74b370 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] End _get_guest_xml xml= Apr 24 00:32:50 user nova-compute[71205]: b330e0bf-027f-4d61-ba28-93465a74b370 Apr 24 00:32:50 user nova-compute[71205]: instance-00000019 Apr 24 00:32:50 user nova-compute[71205]: 131072 Apr 24 00:32:50 user nova-compute[71205]: 1 Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: tempest-TestMinimumBasicScenario-server-1411545761 Apr 24 00:32:50 user nova-compute[71205]: 2023-04-24 00:32:50 Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: 128 Apr 24 00:32:50 user nova-compute[71205]: 1 Apr 24 00:32:50 user nova-compute[71205]: 0 Apr 24 00:32:50 user nova-compute[71205]: 0 Apr 24 00:32:50 user nova-compute[71205]: 1 Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: tempest-TestMinimumBasicScenario-776289633-project-member Apr 24 00:32:50 user nova-compute[71205]: tempest-TestMinimumBasicScenario-776289633 Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: OpenStack Foundation Apr 24 00:32:50 user nova-compute[71205]: OpenStack Nova Apr 24 00:32:50 user nova-compute[71205]: 0.0.0 Apr 24 00:32:50 user nova-compute[71205]: b330e0bf-027f-4d61-ba28-93465a74b370 Apr 24 00:32:50 user nova-compute[71205]: b330e0bf-027f-4d61-ba28-93465a74b370 Apr 24 00:32:50 user nova-compute[71205]: Virtual Machine Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 
24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: hvm Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Nehalem Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: /dev/urandom Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: Apr 24 00:32:50 user nova-compute[71205]: {{(pid=71205) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:32:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1411545761',display_name='tempest-TestMinimumBasicScenario-server-1411545761',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1411545761',id=25,image_ref='fa4c3529-43a6-4316-88b2-b3bbed76c1f7',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA/yBQ3OeQ6p3JWTMHIDZTKaDI1rEfdHuupYSVs84i1LbT47YEJ6RDAi4BJ9tHinujF4UyeJuVsDtyFCcsUXegAo8xzzOvG1iF59HUjKwHaDr5EsL0PxFO1OUkiouNZxPA==',key_name='tempest-TestMinimumBasicScenario-571855735',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c607157092ca4b0ea2d01c6fe7fb18c2',ramdisk_id='',reservation_id='r-hwjbvx4w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fa4c3529-43a6-4316-88b2-b3bbed76c1f7',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-776289633',owner_user_name='tempest-TestMinimumBasicScenario-776289633-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-24T00:32:49Z,user_data=None,user_id='e2ca48cbf8204d958685eaa42f6b6952',uuid=b330e0bf-027f-4d61-ba28-93465a74b370,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c8d80bab-94ba-4cad-aa6e-ff776b596163", "address": "fa:16:3e:7d:dd:a7", "network": {"id": "bbd390a3-499e-4b86-9c48-401b14864b06", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-337104155-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c607157092ca4b0ea2d01c6fe7fb18c2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d80bab-94", "ovs_interfaceid": "c8d80bab-94ba-4cad-aa6e-ff776b596163", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Converting VIF {"id": "c8d80bab-94ba-4cad-aa6e-ff776b596163", "address": "fa:16:3e:7d:dd:a7", "network": {"id": "bbd390a3-499e-4b86-9c48-401b14864b06", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-337104155-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c607157092ca4b0ea2d01c6fe7fb18c2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d80bab-94", "ovs_interfaceid": 
"c8d80bab-94ba-4cad-aa6e-ff776b596163", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:dd:a7,bridge_name='br-int',has_traffic_filtering=True,id=c8d80bab-94ba-4cad-aa6e-ff776b596163,network=Network(bbd390a3-499e-4b86-9c48-401b14864b06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8d80bab-94') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG os_vif [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:dd:a7,bridge_name='br-int',has_traffic_filtering=True,id=c8d80bab-94ba-4cad-aa6e-ff776b596163,network=Network(bbd390a3-499e-4b86-9c48-401b14864b06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8d80bab-94') {{(pid=71205) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 17 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc8d80bab-94, may_exist=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc8d80bab-94, col_values=(('external_ids', {'iface-id': 'c8d80bab-94ba-4cad-aa6e-ff776b596163', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7d:dd:a7', 'vm-uuid': 'b330e0bf-027f-4d61-ba28-93465a74b370'}),)) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:32:50 user nova-compute[71205]: INFO os_vif [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:dd:a7,bridge_name='br-int',has_traffic_filtering=True,id=c8d80bab-94ba-4cad-aa6e-ff776b596163,network=Network(bbd390a3-499e-4b86-9c48-401b14864b06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8d80bab-94') Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] No BDM found with device name vda, not building metadata. {{(pid=71205) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 24 00:32:50 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] No VIF found with MAC fa:16:3e:7d:dd:a7, not building metadata {{(pid=71205) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 24 00:32:51 user nova-compute[71205]: DEBUG nova.network.neutron [req-4c72b616-234c-4600-83db-e2cec3fb907e req-621709ea-3710-4819-a095-252c357d8bc6 service nova] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Updated VIF entry in instance network info cache for port c8d80bab-94ba-4cad-aa6e-ff776b596163. 
{{(pid=71205) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 24 00:32:51 user nova-compute[71205]: DEBUG nova.network.neutron [req-4c72b616-234c-4600-83db-e2cec3fb907e req-621709ea-3710-4819-a095-252c357d8bc6 service nova] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Updating instance_info_cache with network_info: [{"id": "c8d80bab-94ba-4cad-aa6e-ff776b596163", "address": "fa:16:3e:7d:dd:a7", "network": {"id": "bbd390a3-499e-4b86-9c48-401b14864b06", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-337104155-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c607157092ca4b0ea2d01c6fe7fb18c2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d80bab-94", "ovs_interfaceid": "c8d80bab-94ba-4cad-aa6e-ff776b596163", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:32:51 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-4c72b616-234c-4600-83db-e2cec3fb907e req-621709ea-3710-4819-a095-252c357d8bc6 service nova] Releasing lock "refresh_cache-b330e0bf-027f-4d61-ba28-93465a74b370" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:32:52 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:32:52 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:32:52 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:32:52 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:32:52 user nova-compute[71205]: DEBUG nova.compute.manager [req-50eefd25-86a2-44a6-902b-c473d894b2cd req-05090fa2-52ad-4632-9e59-49785ac0ac14 service nova] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Received event network-vif-plugged-c8d80bab-94ba-4cad-aa6e-ff776b596163 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:32:52 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-50eefd25-86a2-44a6-902b-c473d894b2cd req-05090fa2-52ad-4632-9e59-49785ac0ac14 service nova] Acquiring lock "b330e0bf-027f-4d61-ba28-93465a74b370-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:32:52 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-50eefd25-86a2-44a6-902b-c473d894b2cd req-05090fa2-52ad-4632-9e59-49785ac0ac14 service nova] Lock "b330e0bf-027f-4d61-ba28-93465a74b370-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:32:52 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-50eefd25-86a2-44a6-902b-c473d894b2cd req-05090fa2-52ad-4632-9e59-49785ac0ac14 service nova] Lock "b330e0bf-027f-4d61-ba28-93465a74b370-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:32:52 user nova-compute[71205]: DEBUG nova.compute.manager [req-50eefd25-86a2-44a6-902b-c473d894b2cd req-05090fa2-52ad-4632-9e59-49785ac0ac14 service nova] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] No waiting events found dispatching network-vif-plugged-c8d80bab-94ba-4cad-aa6e-ff776b596163 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:32:52 user nova-compute[71205]: WARNING nova.compute.manager [req-50eefd25-86a2-44a6-902b-c473d894b2cd req-05090fa2-52ad-4632-9e59-49785ac0ac14 service nova] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Received unexpected event network-vif-plugged-c8d80bab-94ba-4cad-aa6e-ff776b596163 for instance with vm_state building and task_state spawning. Apr 24 00:32:52 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:32:52 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:32:52 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:32:52 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:32:53 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:32:54 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Resumed> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:32:54 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] VM Resumed (Lifecycle Event) Apr 24 00:32:54 user nova-compute[71205]: DEBUG nova.compute.manager [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Instance event wait completed in 0 seconds for {{(pid=71205) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 24 00:32:54 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Guest created on hypervisor {{(pid=71205) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 24 00:32:54 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] 
[instance: b330e0bf-027f-4d61-ba28-93465a74b370] Instance spawned successfully. Apr 24 00:32:54 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 24 00:32:54 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Found default for hw_cdrom_bus of ide {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:32:54 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Found default for hw_disk_bus of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:32:54 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Found default for hw_input_bus of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:32:54 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Found default for hw_pointer_model of None {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:32:54 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Found default for hw_video_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:32:54 user nova-compute[71205]: DEBUG nova.virt.libvirt.driver [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Found default for hw_vif_model of virtio {{(pid=71205) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 24 00:32:54 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:32:54 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] 
Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:32:54 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:32:54 user nova-compute[71205]: DEBUG nova.virt.driver [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] Emitting event Started> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:32:54 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] VM Started (Lifecycle Event) Apr 24 00:32:54 user nova-compute[71205]: INFO nova.compute.manager [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Took 5.63 seconds to spawn the instance on the hypervisor. Apr 24 00:32:54 user nova-compute[71205]: DEBUG nova.compute.manager [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:32:54 user nova-compute[71205]: DEBUG nova.compute.manager [req-633b30c5-2de4-4db6-afab-28d5906f97f6 req-c4cfc98c-56b6-4718-971e-5224a7f9f96d service nova] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Received event network-vif-plugged-c8d80bab-94ba-4cad-aa6e-ff776b596163 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:32:54 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-633b30c5-2de4-4db6-afab-28d5906f97f6 req-c4cfc98c-56b6-4718-971e-5224a7f9f96d service nova] Acquiring lock "b330e0bf-027f-4d61-ba28-93465a74b370-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:32:54 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-633b30c5-2de4-4db6-afab-28d5906f97f6 req-c4cfc98c-56b6-4718-971e-5224a7f9f96d service nova] Lock "b330e0bf-027f-4d61-ba28-93465a74b370-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:32:54 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-633b30c5-2de4-4db6-afab-28d5906f97f6 req-c4cfc98c-56b6-4718-971e-5224a7f9f96d service nova] Lock "b330e0bf-027f-4d61-ba28-93465a74b370-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:32:54 user nova-compute[71205]: DEBUG nova.compute.manager [req-633b30c5-2de4-4db6-afab-28d5906f97f6 req-c4cfc98c-56b6-4718-971e-5224a7f9f96d service nova] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] No waiting events found dispatching network-vif-plugged-c8d80bab-94ba-4cad-aa6e-ff776b596163 {{(pid=71205) 
pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:32:54 user nova-compute[71205]: WARNING nova.compute.manager [req-633b30c5-2de4-4db6-afab-28d5906f97f6 req-c4cfc98c-56b6-4718-971e-5224a7f9f96d service nova] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Received unexpected event network-vif-plugged-c8d80bab-94ba-4cad-aa6e-ff776b596163 for instance with vm_state building and task_state spawning. Apr 24 00:32:54 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:32:54 user nova-compute[71205]: DEBUG nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71205) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 24 00:32:54 user nova-compute[71205]: INFO nova.compute.manager [None req-75b15b8b-fbdc-48e1-bc23-e82bf05be0a0 None None] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] During sync_power_state the instance has a pending task (spawning). Skip. Apr 24 00:32:54 user nova-compute[71205]: INFO nova.compute.manager [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Took 6.17 seconds to build instance. Apr 24 00:32:54 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-9be327b3-3aae-443a-ac0a-82906860f34c tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "b330e0bf-027f-4d61-ba28-93465a74b370" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.275s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:32:55 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:32:58 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:33:00 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:33:05 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:33:08 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:33:10 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:33:13 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:33:15 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog 
[-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:33:18 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:33:20 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:33:23 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:33:25 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:33:29 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:33:30 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:33:31 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:33:31 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71205) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 24 00:33:32 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:33:32 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:33:32 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:33:32 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:33:33 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:33:33 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:33:33 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:33:33 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:33:33 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:33:33 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:33:33 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/b330e0bf-027f-4d61-ba28-93465a74b370/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:33:33 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b330e0bf-027f-4d61-ba28-93465a74b370/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:33:33 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b330e0bf-027f-4d61-ba28-93465a74b370/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:33:33 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b330e0bf-027f-4d61-ba28-93465a74b370/disk --force-share --output=json" returned: 0 in 0.168s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:33:34 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:33:34 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 24 00:33:34 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Hypervisor/Node resource view: name=user free_ram=9133MB free_disk=26.55356216430664GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:33:34 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:33:34 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:33:34 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance b330e0bf-027f-4d61-ba28-93465a74b370 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:33:34 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:33:34 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:33:34 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Refreshing inventories for resource provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 24 00:33:34 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Updating ProviderTree inventory for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 24 00:33:34 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Updating inventory in ProviderTree for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 24 00:33:34 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Refreshing aggregate associations for resource provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4, aggregates: None {{(pid=71205) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 24 00:33:34 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Refreshing trait associations for resource provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4, traits: 
HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_DEVICE_TAGGING {{(pid=71205) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 24 00:33:34 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:33:34 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:33:34 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:33:34 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.387s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:33:35 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:33:35 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:33:36 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task 
ComputeManager._heal_instance_info_cache {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:33:36 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Starting heal instance info cache {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 24 00:33:36 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Rebuilding the list of instances to heal {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 24 00:33:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "refresh_cache-b330e0bf-027f-4d61-ba28-93465a74b370" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:33:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquired lock "refresh_cache-b330e0bf-027f-4d61-ba28-93465a74b370" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:33:36 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Forcefully refreshing network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 24 00:33:36 user nova-compute[71205]: DEBUG nova.objects.instance [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lazy-loading 'info_cache' on Instance uuid b330e0bf-027f-4d61-ba28-93465a74b370 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:33:36 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Updating instance_info_cache with network_info: [{"id": "c8d80bab-94ba-4cad-aa6e-ff776b596163", "address": "fa:16:3e:7d:dd:a7", "network": {"id": "bbd390a3-499e-4b86-9c48-401b14864b06", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-337104155-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c607157092ca4b0ea2d01c6fe7fb18c2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d80bab-94", "ovs_interfaceid": "c8d80bab-94ba-4cad-aa6e-ff776b596163", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:33:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Releasing lock "refresh_cache-b330e0bf-027f-4d61-ba28-93465a74b370" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:33:36 user nova-compute[71205]: DEBUG nova.compute.manager [None 
req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Updated the network info_cache for instance {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 24 00:33:38 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:33:40 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:33:45 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:33:50 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:33:50 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:33:50 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:33:50 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:33:50 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:33:50 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:33:55 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:34:00 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:34:05 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:34:08 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:34:10 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:34:15 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:34:20 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:34:20 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:34:20 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:34:20 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:34:20 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:34:20 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:34:25 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:34:30 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:34:30 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:34:31 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:34:31 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71205) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 24 00:34:32 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:34:33 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:34:33 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:34:34 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:34:34 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:34:35 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:34:35 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:34:35 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:34:35 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:34:35 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:34:35 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/b330e0bf-027f-4d61-ba28-93465a74b370/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:34:35 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b330e0bf-027f-4d61-ba28-93465a74b370/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:34:35 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b330e0bf-027f-4d61-ba28-93465a74b370/disk --force-share --output=json {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 24 00:34:35 user nova-compute[71205]: DEBUG oslo_concurrency.processutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b330e0bf-027f-4d61-ba28-93465a74b370/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71205) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 24 00:34:36 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:34:36 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:34:36 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 24 00:34:36 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Hypervisor/Node resource view: name=user free_ram=9147MB free_disk=26.552867889404297GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:34:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:34:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:34:36 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Instance b330e0bf-027f-4d61-ba28-93465a74b370 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71205) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 24 00:34:36 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:34:36 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:34:36 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:34:36 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:34:36 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:34:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.172s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:34:39 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:34:39 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Starting heal instance info cache {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 24 00:34:39 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Rebuilding the list of instances to heal {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 24 00:34:39 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "refresh_cache-b330e0bf-027f-4d61-ba28-93465a74b370" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 24 00:34:39 user nova-compute[71205]: DEBUG 
oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquired lock "refresh_cache-b330e0bf-027f-4d61-ba28-93465a74b370" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 24 00:34:39 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Forcefully refreshing network info cache for instance {{(pid=71205) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 24 00:34:39 user nova-compute[71205]: DEBUG nova.objects.instance [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lazy-loading 'info_cache' on Instance uuid b330e0bf-027f-4d61-ba28-93465a74b370 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:34:39 user nova-compute[71205]: DEBUG nova.network.neutron [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Updating instance_info_cache with network_info: [{"id": "c8d80bab-94ba-4cad-aa6e-ff776b596163", "address": "fa:16:3e:7d:dd:a7", "network": {"id": "bbd390a3-499e-4b86-9c48-401b14864b06", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-337104155-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c607157092ca4b0ea2d01c6fe7fb18c2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d80bab-94", "ovs_interfaceid": "c8d80bab-94ba-4cad-aa6e-ff776b596163", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:34:39 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Releasing lock "refresh_cache-b330e0bf-027f-4d61-ba28-93465a74b370" {{(pid=71205) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 24 00:34:39 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Updated the network info_cache for instance {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 24 00:34:40 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-91e2e752-1f1b-4eaa-840b-935c0ab25652 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Acquiring lock "b330e0bf-027f-4d61-ba28-93465a74b370" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:34:40 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-91e2e752-1f1b-4eaa-840b-935c0ab25652 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "b330e0bf-027f-4d61-ba28-93465a74b370" acquired by 
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:34:40 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-91e2e752-1f1b-4eaa-840b-935c0ab25652 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Acquiring lock "b330e0bf-027f-4d61-ba28-93465a74b370-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:34:40 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-91e2e752-1f1b-4eaa-840b-935c0ab25652 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "b330e0bf-027f-4d61-ba28-93465a74b370-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:34:40 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-91e2e752-1f1b-4eaa-840b-935c0ab25652 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "b330e0bf-027f-4d61-ba28-93465a74b370-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:34:40 user nova-compute[71205]: INFO nova.compute.manager [None req-91e2e752-1f1b-4eaa-840b-935c0ab25652 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Terminating instance Apr 24 00:34:40 user nova-compute[71205]: DEBUG nova.compute.manager [None req-91e2e752-1f1b-4eaa-840b-935c0ab25652 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Start destroying the instance on the hypervisor. 
{{(pid=71205) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 24 00:34:40 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:34:40 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:34:40 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:34:40 user nova-compute[71205]: DEBUG nova.compute.manager [req-c6fe540a-ab54-41d6-a0d8-09e7e95d8418 req-c41a0e0e-c5df-4163-a9d9-2dd0d287af85 service nova] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Received event network-vif-unplugged-c8d80bab-94ba-4cad-aa6e-ff776b596163 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:34:40 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-c6fe540a-ab54-41d6-a0d8-09e7e95d8418 req-c41a0e0e-c5df-4163-a9d9-2dd0d287af85 service nova] Acquiring lock "b330e0bf-027f-4d61-ba28-93465a74b370-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:34:40 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-c6fe540a-ab54-41d6-a0d8-09e7e95d8418 req-c41a0e0e-c5df-4163-a9d9-2dd0d287af85 service nova] Lock "b330e0bf-027f-4d61-ba28-93465a74b370-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:34:40 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-c6fe540a-ab54-41d6-a0d8-09e7e95d8418 req-c41a0e0e-c5df-4163-a9d9-2dd0d287af85 service nova] Lock "b330e0bf-027f-4d61-ba28-93465a74b370-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:34:40 user nova-compute[71205]: DEBUG nova.compute.manager [req-c6fe540a-ab54-41d6-a0d8-09e7e95d8418 req-c41a0e0e-c5df-4163-a9d9-2dd0d287af85 service nova] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] No waiting events found dispatching network-vif-unplugged-c8d80bab-94ba-4cad-aa6e-ff776b596163 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:34:40 user nova-compute[71205]: DEBUG nova.compute.manager [req-c6fe540a-ab54-41d6-a0d8-09e7e95d8418 req-c41a0e0e-c5df-4163-a9d9-2dd0d287af85 service nova] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Received event network-vif-unplugged-c8d80bab-94ba-4cad-aa6e-ff776b596163 for instance with task_state deleting. {{(pid=71205) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 24 00:34:41 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:34:41 user nova-compute[71205]: INFO nova.virt.libvirt.driver [-] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Instance destroyed successfully. 
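The Acquiring lock / acquired ... waited / "released" ... held triplets around do_terminate_instance and the event handlers come from oslo_concurrency.lockutils: the critical section is wrapped in the module's synchronized() decorator or lock() context manager, and its inner wrapper logs how long the caller waited for and then held the lock. A small sketch of both forms, assuming oslo.concurrency is installed; the lock names below are illustrative, not Nova's own.

    # Sketch of the oslo.concurrency locking that produces the
    # "Acquiring lock ... / acquired ... waited / released ... held" lines above.
    import logging
    from oslo_concurrency import lockutils

    logging.basicConfig(level=logging.DEBUG)

    # Decorator form: serializes every call on the same semaphore name and
    # logs wait/hold times from the wrapper.
    @lockutils.synchronized("demo-compute-resources")
    def update_usage():
        pass  # critical section

    # Context-manager form, comparable to the per-instance terminate/refresh locks.
    def do_terminate(instance_uuid):
        with lockutils.lock(instance_uuid):
            pass  # only one termination per instance at a time

    if __name__ == "__main__":
        update_usage()
        do_terminate("b330e0bf-027f-4d61-ba28-93465a74b370")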
Apr 24 00:34:41 user nova-compute[71205]: DEBUG nova.objects.instance [None req-91e2e752-1f1b-4eaa-840b-935c0ab25652 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lazy-loading 'resources' on Instance uuid b330e0bf-027f-4d61-ba28-93465a74b370 {{(pid=71205) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 24 00:34:41 user nova-compute[71205]: DEBUG nova.virt.libvirt.vif [None req-91e2e752-1f1b-4eaa-840b-935c0ab25652 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-24T00:32:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1411545761',display_name='tempest-TestMinimumBasicScenario-server-1411545761',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1411545761',id=25,image_ref='fa4c3529-43a6-4316-88b2-b3bbed76c1f7',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA/yBQ3OeQ6p3JWTMHIDZTKaDI1rEfdHuupYSVs84i1LbT47YEJ6RDAi4BJ9tHinujF4UyeJuVsDtyFCcsUXegAo8xzzOvG1iF59HUjKwHaDr5EsL0PxFO1OUkiouNZxPA==',key_name='tempest-TestMinimumBasicScenario-571855735',keypairs=,launch_index=0,launched_at=2023-04-24T00:32:54Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='c607157092ca4b0ea2d01c6fe7fb18c2',ramdisk_id='',reservation_id='r-hwjbvx4w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='fa4c3529-43a6-4316-88b2-b3bbed76c1f7',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-776289633',owner_user_name='tempest-TestMinimumBasicScenario-776289633-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-24T00:32:54Z,user_data=None,user_id='e2ca48cbf8204d958685eaa42f6b6952',uuid=b330e0bf-027f-4d61-ba28-93465a74b370,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c8d80bab-94ba-4cad-aa6e-ff776b596163", "address": "fa:16:3e:7d:dd:a7", "network": {"id": "bbd390a3-499e-4b86-9c48-401b14864b06", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-337104155-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c607157092ca4b0ea2d01c6fe7fb18c2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": 
{"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d80bab-94", "ovs_interfaceid": "c8d80bab-94ba-4cad-aa6e-ff776b596163", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 24 00:34:41 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-91e2e752-1f1b-4eaa-840b-935c0ab25652 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Converting VIF {"id": "c8d80bab-94ba-4cad-aa6e-ff776b596163", "address": "fa:16:3e:7d:dd:a7", "network": {"id": "bbd390a3-499e-4b86-9c48-401b14864b06", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-337104155-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c607157092ca4b0ea2d01c6fe7fb18c2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8d80bab-94", "ovs_interfaceid": "c8d80bab-94ba-4cad-aa6e-ff776b596163", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 24 00:34:41 user nova-compute[71205]: DEBUG nova.network.os_vif_util [None req-91e2e752-1f1b-4eaa-840b-935c0ab25652 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7d:dd:a7,bridge_name='br-int',has_traffic_filtering=True,id=c8d80bab-94ba-4cad-aa6e-ff776b596163,network=Network(bbd390a3-499e-4b86-9c48-401b14864b06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8d80bab-94') {{(pid=71205) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 24 00:34:41 user nova-compute[71205]: DEBUG os_vif [None req-91e2e752-1f1b-4eaa-840b-935c0ab25652 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7d:dd:a7,bridge_name='br-int',has_traffic_filtering=True,id=c8d80bab-94ba-4cad-aa6e-ff776b596163,network=Network(bbd390a3-499e-4b86-9c48-401b14864b06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8d80bab-94') {{(pid=71205) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 24 00:34:41 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:34:41 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc8d80bab-94, bridge=br-int, if_exists=True) {{(pid=71205) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 24 00:34:41 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:34:41 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:34:41 user nova-compute[71205]: INFO os_vif [None req-91e2e752-1f1b-4eaa-840b-935c0ab25652 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7d:dd:a7,bridge_name='br-int',has_traffic_filtering=True,id=c8d80bab-94ba-4cad-aa6e-ff776b596163,network=Network(bbd390a3-499e-4b86-9c48-401b14864b06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8d80bab-94') Apr 24 00:34:41 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-91e2e752-1f1b-4eaa-840b-935c0ab25652 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Deleting instance files /opt/stack/data/nova/instances/b330e0bf-027f-4d61-ba28-93465a74b370_del Apr 24 00:34:41 user nova-compute[71205]: INFO nova.virt.libvirt.driver [None req-91e2e752-1f1b-4eaa-840b-935c0ab25652 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Deletion of /opt/stack/data/nova/instances/b330e0bf-027f-4d61-ba28-93465a74b370_del complete Apr 24 00:34:41 user nova-compute[71205]: INFO nova.compute.manager [None req-91e2e752-1f1b-4eaa-840b-935c0ab25652 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Took 0.67 seconds to destroy the instance on the hypervisor. Apr 24 00:34:41 user nova-compute[71205]: DEBUG oslo.service.loopingcall [None req-91e2e752-1f1b-4eaa-840b-935c0ab25652 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71205) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 24 00:34:41 user nova-compute[71205]: DEBUG nova.compute.manager [-] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Deallocating network for instance {{(pid=71205) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 24 00:34:41 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] deallocate_for_instance() {{(pid=71205) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 24 00:34:41 user nova-compute[71205]: DEBUG nova.network.neutron [-] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:34:41 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Took 0.62 seconds to deallocate network for instance. 
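The unplug path above goes through os-vif: the network-info dict is converted to a VIFOpenVSwitch object and handed to os_vif.unplug(), and the ovs plugin then issues the DelPortCommand seen in the ovsdbapp lines to drop tapc8d80bab-94 from br-int over tcp:127.0.0.1:6640. A hedged sketch of that call, with the VIF field values copied from the "Converted object" log entry; the InstanceInfo attributes are assumed from os-vif's object model rather than taken from this log.

    # Sketch: unplug an OVS VIF the way the log above shows os-vif doing it.
    import os_vif
    from os_vif.objects import instance_info, vif

    os_vif.initialize()  # loads the linux_bridge/noop/ovs plugins

    port = vif.VIFOpenVSwitch(
        id="c8d80bab-94ba-4cad-aa6e-ff776b596163",
        address="fa:16:3e:7d:dd:a7",
        vif_name="tapc8d80bab-94",
        bridge_name="br-int",
        has_traffic_filtering=True,
        preserve_on_delete=False,
        port_profile=vif.VIFPortProfileOpenVSwitch(
            interface_id="c8d80bab-94ba-4cad-aa6e-ff776b596163"),
    )
    # InstanceInfo fields here are an assumption of this sketch.
    instance = instance_info.InstanceInfo(
        uuid="b330e0bf-027f-4d61-ba28-93465a74b370",
        name="tempest-TestMinimumBasicScenario-server-1411545761",
    )

    # The ovs plugin ends up running DelPortCommand(port=tapc8d80bab-94,
    # bridge=br-int, if_exists=True) against the local ovsdb.
    os_vif.unplug(port, instance)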
Apr 24 00:34:41 user nova-compute[71205]: DEBUG nova.compute.manager [req-9684334b-d412-4973-b381-e94bf76e92a3 req-7e2379c1-a228-4f57-91db-574fbe9d7b01 service nova] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Received event network-vif-deleted-c8d80bab-94ba-4cad-aa6e-ff776b596163 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:34:41 user nova-compute[71205]: INFO nova.compute.manager [req-9684334b-d412-4973-b381-e94bf76e92a3 req-7e2379c1-a228-4f57-91db-574fbe9d7b01 service nova] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Neutron deleted interface c8d80bab-94ba-4cad-aa6e-ff776b596163; detaching it from the instance and deleting it from the info cache Apr 24 00:34:41 user nova-compute[71205]: DEBUG nova.network.neutron [req-9684334b-d412-4973-b381-e94bf76e92a3 req-7e2379c1-a228-4f57-91db-574fbe9d7b01 service nova] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Updating instance_info_cache with network_info: [] {{(pid=71205) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 24 00:34:41 user nova-compute[71205]: DEBUG nova.compute.manager [req-9684334b-d412-4973-b381-e94bf76e92a3 req-7e2379c1-a228-4f57-91db-574fbe9d7b01 service nova] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Detach interface failed, port_id=c8d80bab-94ba-4cad-aa6e-ff776b596163, reason: Instance b330e0bf-027f-4d61-ba28-93465a74b370 could not be found. {{(pid=71205) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 24 00:34:41 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-91e2e752-1f1b-4eaa-840b-935c0ab25652 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:34:41 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-91e2e752-1f1b-4eaa-840b-935c0ab25652 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:34:41 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-91e2e752-1f1b-4eaa-840b-935c0ab25652 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:34:41 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-91e2e752-1f1b-4eaa-840b-935c0ab25652 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:34:42 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-91e2e752-1f1b-4eaa-840b-935c0ab25652 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.109s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:34:42 user nova-compute[71205]: INFO nova.scheduler.client.report [None req-91e2e752-1f1b-4eaa-840b-935c0ab25652 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Deleted allocations for instance b330e0bf-027f-4d61-ba28-93465a74b370 Apr 24 00:34:42 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-91e2e752-1f1b-4eaa-840b-935c0ab25652 tempest-TestMinimumBasicScenario-776289633 tempest-TestMinimumBasicScenario-776289633-project-member] Lock "b330e0bf-027f-4d61-ba28-93465a74b370" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.593s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:34:42 user nova-compute[71205]: DEBUG nova.compute.manager [req-932939e0-0e9e-4e1d-bb51-47493ae7dce7 req-2a4ed8a6-5a60-4fef-ae23-613ddff7d5b0 service nova] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Received event network-vif-plugged-c8d80bab-94ba-4cad-aa6e-ff776b596163 {{(pid=71205) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 24 00:34:42 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-932939e0-0e9e-4e1d-bb51-47493ae7dce7 req-2a4ed8a6-5a60-4fef-ae23-613ddff7d5b0 service nova] Acquiring lock "b330e0bf-027f-4d61-ba28-93465a74b370-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:34:42 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-932939e0-0e9e-4e1d-bb51-47493ae7dce7 req-2a4ed8a6-5a60-4fef-ae23-613ddff7d5b0 service nova] Lock "b330e0bf-027f-4d61-ba28-93465a74b370-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:34:42 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [req-932939e0-0e9e-4e1d-bb51-47493ae7dce7 req-2a4ed8a6-5a60-4fef-ae23-613ddff7d5b0 service nova] Lock "b330e0bf-027f-4d61-ba28-93465a74b370-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:34:42 user nova-compute[71205]: DEBUG nova.compute.manager [req-932939e0-0e9e-4e1d-bb51-47493ae7dce7 req-2a4ed8a6-5a60-4fef-ae23-613ddff7d5b0 service nova] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] No waiting events found dispatching network-vif-plugged-c8d80bab-94ba-4cad-aa6e-ff776b596163 {{(pid=71205) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 24 00:34:42 user nova-compute[71205]: WARNING nova.compute.manager [req-932939e0-0e9e-4e1d-bb51-47493ae7dce7 req-2a4ed8a6-5a60-4fef-ae23-613ddff7d5b0 service nova] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Received unexpected event network-vif-plugged-c8d80bab-94ba-4cad-aa6e-ff776b596163 
for instance with vm_state deleted and task_state None. Apr 24 00:34:46 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:34:51 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:34:56 user nova-compute[71205]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71205) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 24 00:34:56 user nova-compute[71205]: INFO nova.compute.manager [-] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] VM Stopped (Lifecycle Event) Apr 24 00:34:56 user nova-compute[71205]: DEBUG nova.compute.manager [None req-fcb050f9-aa02-4f84-88b3-e196abeb15ed None None] [instance: b330e0bf-027f-4d61-ba28-93465a74b370] Checking state {{(pid=71205) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 24 00:34:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:34:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:34:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:34:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:34:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:34:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:35:01 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:35:06 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:35:11 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:35:16 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:35:21 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:35:26 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:35:30 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71205) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:35:31 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:35:33 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:35:33 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:35:33 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:35:33 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:35:33 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71205) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 24 00:35:34 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:35:34 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:35:35 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:35:35 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:35:36 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:35:36 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:35:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 
24 00:35:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:35:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:35:36 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Auditing locally available compute resources for user (node: user) {{(pid=71205) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 24 00:35:36 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:35:36 user nova-compute[71205]: WARNING nova.virt.libvirt.driver [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 24 00:35:36 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Hypervisor/Node resource view: name=user free_ram=9242MB free_disk=26.594017028808594GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": 
"07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": 
"type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}] {{(pid=71205) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 24 00:35:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 24 00:35:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 24 00:35:36 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 24 00:35:36 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB 
phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71205) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 24 00:35:36 user nova-compute[71205]: DEBUG nova.compute.provider_tree [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed in ProviderTree for provider: 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 {{(pid=71205) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 24 00:35:36 user nova-compute[71205]: DEBUG nova.scheduler.client.report [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Inventory has not changed for provider 67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71205) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 24 00:35:36 user nova-compute[71205]: DEBUG nova.compute.resource_tracker [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Compute_service record updated for user:user {{(pid=71205) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 24 00:35:36 user nova-compute[71205]: DEBUG oslo_concurrency.lockutils [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.136s {{(pid=71205) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 24 00:35:37 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:35:40 user nova-compute[71205]: DEBUG oslo_service.periodic_task [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71205) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 24 00:35:40 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Starting heal instance info cache {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 24 00:35:40 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Rebuilding the list of instances to heal {{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 24 00:35:40 user nova-compute[71205]: DEBUG nova.compute.manager [None req-6f7e8b73-6aa3-4c79-a52f-9b9014d091d1 None None] Didn't find any instances for network info cache update. 
{{(pid=71205) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 24 00:35:41 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:35:46 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:35:51 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:35:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 24 00:35:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 24 00:35:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71205) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 24 00:35:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:35:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71205) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 24 00:35:56 user nova-compute[71205]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71205) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
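The resource-tracker pass above shows the bookkeeping returning to baseline after the delete: used_ram drops from 640MB (512MB reserved plus the 128MB instance) back to 512MB, used_vcpus from 1 to 0, and the computed inventory still matches what placement holds, so the report client logs "Inventory has not changed" instead of issuing an update. A toy version of that compare-before-update decision, with the dictionaries mirroring the inventory data printed in the log; the helper name is invented for illustration.

    # Toy version of the "Inventory has not changed" decision in the log:
    # only report to placement when the computed inventory differs.
    CURRENT = {
        "VCPU": {"total": 12, "reserved": 0, "min_unit": 1, "max_unit": 12,
                 "step_size": 1, "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 16023, "reserved": 512, "min_unit": 1,
                      "max_unit": 16023, "step_size": 1, "allocation_ratio": 1.0},
        "DISK_GB": {"total": 40, "reserved": 0, "min_unit": 1, "max_unit": 40,
                    "step_size": 1, "allocation_ratio": 1.0},
    }

    def maybe_update_inventory(provider_uuid, computed, known=CURRENT):
        """Pretend-report inventory; skip the update when nothing changed."""
        if computed == known:
            print(f"Inventory has not changed for provider {provider_uuid}")
            return False
        print(f"Updating inventory for provider {provider_uuid}")
        # A real reporter would PUT the new inventory to placement here.
        return True

    if __name__ == "__main__":
        maybe_update_inventory("67b4ed56-aae5-4ace-a9ba-cc832bf2fdf4", dict(CURRENT))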